Displays a stream of logs from an Inference Endpoint. The logs include information about the deployment’s status, the model being served, and the requests being processed.
## Usage
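A minimal invocation sketch. The exact command path is an assumption here (shown as `flexai inference logs`), and `my-endpoint` is a placeholder endpoint name:

```shell
# Stream logs for an Inference Endpoint named "my-endpoint"
# ("flexai inference logs" is an assumed command path, not confirmed by this page)
flexai inference logs my-endpoint
```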
## Arguments
| Argument | Type | Required | Description |
|---|---|---|---|
| inference_endpoint_name | string | Yes | The name of the Inference Endpoint to view logs for. |
## Flags
| Flag | Short | Type | Description |
|---|---|---|---|
| --help | -h | boolean | Displays this help page. |
| --no-color | | boolean | Disables color formatting in the log output. |
| --verbose | -v | boolean | Provides more detailed output when viewing logs. |
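The flags above can be combined in one call. A hedged sketch, again assuming the command path is `flexai inference logs` and using `my-endpoint` as a placeholder:

```shell
# Verbose, color-free log stream — handy when redirecting output to a file
flexai inference logs my-endpoint --verbose --no-color > endpoint.log
```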