

flexai inference logs

Streams logs from an Inference Endpoint. The logs include the deployment's status, the model being served, and the requests being processed.

Usage

flexai inference logs <inference_endpoint_name> [flags]

Arguments

| Argument | Type | Required | Description |
| --- | --- | --- | --- |
| `inference_endpoint_name` | string | Yes | The name of the Inference Endpoint to view logs for. |

Flags

| Flag | Short | Type | Description |
| --- | --- | --- | --- |
| `--help` | `-h` | boolean | Displays this help page. |
| `--no-color` | | boolean | Disables color formatting in the log output. |
| `--verbose` | `-v` | boolean | Provides more detailed output when viewing logs. |
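
For example, to stream logs from an endpoint and then re-run the command with the flags documented above (the endpoint name `my-endpoint` is a placeholder, not taken from this page):

```shell
# Stream logs for an Inference Endpoint named "my-endpoint"
# ("my-endpoint" is illustrative; substitute your own endpoint name).
flexai inference logs my-endpoint

# The same, with color formatting disabled and more detailed output:
flexai inference logs my-endpoint --no-color --verbose
```

Disabling color with `--no-color` is useful when piping the stream into a file or another tool, where ANSI escape codes would otherwise clutter the output.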