inference logs

Streams the logs of an Inference Endpoint. The logs include information about the deployment's status, the model being served, and the requests being processed.

flexai inference logs <inference_endpoint_name>
<inference_endpoint_name> (Required)

The name of the Inference Endpoint to view logs for.

Examples
  • mixtral_8x7b
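
Putting the command and the example endpoint name together, a typical invocation looks like this (assuming an endpoint named mixtral_8x7b exists in your account):

```shell
# Stream logs from the Inference Endpoint named "mixtral_8x7b"
flexai inference logs mixtral_8x7b
```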