The flexai secret command manages Secrets using the FlexAI Secret Manager. Secrets are encrypted at rest and are only made available to Workload Runtimes as system environment variables. Any number of Secrets can be passed to a Training Job using the -S/--secret flag:
flexai training run test-training-123 \
  --dataset open_web \
  --repository-url https://github.com/flexaihq/nanoGPT/ \
  --secret HF_TOKEN=hf_token_dev \
  --secret WANDB_API_KEY=wandb-key \
  -- train.py ...
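Because Secrets are injected into the Workload Runtime as system environment variables, the target script reads them like any other environment variable. The snippet below is a minimal sketch of how a train.py script might pick up the HF_TOKEN and WANDB_API_KEY values from the example above; how your own script handles these values is up to you.

import os

# Secrets passed with -S/--secret are exposed to the Workload Runtime as
# environment variables named after the left-hand side of the flag
# (e.g. --secret HF_TOKEN=hf_token_dev -> HF_TOKEN).
hf_token = os.environ.get("HF_TOKEN")
wandb_api_key = os.environ.get("WANDB_API_KEY")

if hf_token is None:
    raise RuntimeError("HF_TOKEN was not injected into the runtime environment")

# Use the values with whichever client libraries the job needs, e.g.
# authenticating to the Hugging Face Hub or to Weights & Biases.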
In the same way, Secrets can be passed when creating an Inference Endpoint using the --hf-token-secret or --api-key-secret flags:
flexai inference serve llm-text-inference-prod \
  --hf-token-secret hf_token_prod \
  --api-key-secret api-key-prod \
  -- --model=mistralai/Mistral-7B-Instruct-v0.1 ...
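Once the Endpoint is up, clients authenticate with the value stored in the api-key-prod Secret. The sketch below is only an illustration: it assumes the Endpoint exposes an OpenAI-compatible /v1/chat/completions route and accepts the key as a Bearer token, and the URL and key placeholders are hypothetical; check the Inference documentation for the exact interface of your Endpoint.

import requests

ENDPOINT_URL = "https://<your-endpoint-url>"       # hypothetical placeholder
API_KEY = "<value stored in the api-key-prod Secret>"  # hypothetical placeholder

# Assumption: OpenAI-compatible chat completions route with Bearer auth.
resp = requests.post(
    f"{ENDPOINT_URL}/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "mistralai/Mistral-7B-Instruct-v0.1",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())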
Commands that leverage the FlexAI Secret Manager include:

Available subcommands