Model providers

This page covers LLM/model providers (not chat providers like WhatsApp/Telegram). For model selection rules, see /concepts/models.

Quick rules

  • Model refs use provider/model (example: opencode/claude-opus-4-5).
  • If you set agents.defaults.models, it becomes the allowlist: agents can only pick models listed there (see the sketch after this list).
  • CLI helpers: clawdbot onboard, clawdbot models list, clawdbot models set <provider/model>.
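
For example, a minimal sketch combining both rules; the model refs are the example models from the provider sections below, and the alias values are illustrative:
{
  agents: {
    defaults: {
      model: { primary: "anthropic/claude-opus-4-5" },
      models: {
        "anthropic/claude-opus-4-5": { alias: "Opus" },
        "openai/gpt-5.2": { alias: "GPT" }
      }
    }
  }
}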

Built-in providers (pi-ai catalog)

Clawdbot ships with the pi‑ai catalog. These providers require no models.providers config; just set up auth and pick a model.

OpenAI

  • Provider: openai
  • Auth: OPENAI_API_KEY
  • Example model: openai/gpt-5.2
  • CLI: clawdbot onboard --auth-choice openai-api-key
{
  agents: { defaults: { model: { primary: "openai/gpt-5.2" } } }
}

Anthropic

  • Provider: anthropic
  • Auth: ANTHROPIC_API_KEY or claude setup-token
  • Example model: anthropic/claude-opus-4-5
  • CLI: clawdbot onboard --auth-choice setup-token
{
  agents: { defaults: { model: { primary: "anthropic/claude-opus-4-5" } } }
}

OpenAI Code (Codex)

  • Provider: openai-codex
  • Auth: OAuth or Codex CLI (~/.codex/auth.json)
  • Example model: openai-codex/gpt-5.2
  • CLI: clawdbot onboard --auth-choice openai-codex (or --auth-choice codex-cli)
{
  agents: { defaults: { model: { primary: "openai-codex/gpt-5.2" } } }
}

OpenCode Zen

  • Provider: opencode
  • Auth: OPENCODE_API_KEY (or OPENCODE_ZEN_API_KEY)
  • Example model: opencode/claude-opus-4-5
  • CLI: clawdbot onboard --auth-choice opencode-zen
{
  agents: { defaults: { model: { primary: "opencode/claude-opus-4-5" } } }
}

Google Gemini (API key)

  • Provider: google
  • Auth: GEMINI_API_KEY
  • Example model: google/gemini-3-pro
  • CLI: clawdbot onboard --auth-choice gemini-api-key
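
Config example, following the same pattern as the providers above:
{
  agents: { defaults: { model: { primary: "google/gemini-3-pro" } } }
}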

Google Vertex / Antigravity / Gemini CLI

  • Providers: google-vertex, google-antigravity, google-gemini-cli
  • Auth: Vertex uses gcloud Application Default Credentials (ADC); Antigravity and Gemini CLI use their respective auth flows
  • CLI: clawdbot onboard --auth-choice antigravity (others via interactive wizard)
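
A config sketch for Vertex; the model id below is illustrative, so run clawdbot models list to see which refs these providers actually expose:
{
  agents: { defaults: { model: { primary: "google-vertex/gemini-3-pro" } } }
}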

Z.AI (GLM)

  • Provider: zai
  • Auth: ZAI_API_KEY
  • Example model: zai/glm-4.7
  • CLI: clawdbot onboard --auth-choice zai-api-key
  • Aliases: z.ai/* and z-ai/* normalize to zai/*
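
Config example, following the same pattern as the providers above:
{
  agents: { defaults: { model: { primary: "zai/glm-4.7" } } }
}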

Other built-in providers

  • OpenRouter: openrouter (OPENROUTER_API_KEY); example model: openrouter/anthropic/claude-sonnet-4-5 (used in the config sketch after this list)
  • xAI: xai (XAI_API_KEY)
  • Groq: groq (GROQ_API_KEY)
  • Cerebras: cerebras (CEREBRAS_API_KEY)
  • Mistral: mistral (MISTRAL_API_KEY)
  • GitHub Copilot: github-copilot (COPILOT_GITHUB_TOKEN / GH_TOKEN / GITHUB_TOKEN)
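
These follow the same config pattern; for example, with the OpenRouter model above:
{
  agents: { defaults: { model: { primary: "openrouter/anthropic/claude-sonnet-4-5" } } }
}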

Providers via models.providers (custom/base URL)

Use models.providers (or models.json) to add custom providers or OpenAI/Anthropic‑compatible proxies.

MiniMax

MiniMax is configured via models.providers because it uses custom endpoints:
  • MiniMax Cloud (OpenAI‑compatible): --auth-choice minimax-cloud
  • MiniMax API (Anthropic‑compatible): --auth-choice minimax-api
  • Auth: MINIMAX_API_KEY
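
If you configure MiniMax by hand rather than through the onboarding wizard, the shape mirrors the local proxy example in the next section. The base URL, model id, and limits below are placeholders, not confirmed MiniMax values; the Anthropic‑compatible MiniMax API would use a different api value:
{
  models: {
    providers: {
      minimax: {
        baseUrl: "https://<minimax-endpoint>/v1",
        apiKey: "MINIMAX_API_KEY",
        api: "openai-completions",
        models: [
          {
            id: "minimax-m2.1",
            name: "MiniMax M2.1",
            reasoning: false,
            input: ["text"],
            cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
            contextWindow: 200000,
            maxTokens: 8192
          }
        ]
      }
    }
  }
}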

Local proxies (LM Studio, vLLM, LiteLLM, etc.)

Example (OpenAI‑compatible):
{
  agents: {
    defaults: {
      model: { primary: "lmstudio/minimax-m2.1-gs32" },
      models: { "lmstudio/minimax-m2.1-gs32": { alias: "Minimax" } }
    }
  },
  models: {
    providers: {
      lmstudio: {
        baseUrl: "http://localhost:1234/v1",
        apiKey: "LMSTUDIO_KEY",
        api: "openai-completions",
        models: [
          {
            id: "minimax-m2.1-gs32",
            name: "MiniMax M2.1",
            reasoning: false,
            input: ["text"],
            cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
            contextWindow: 200000,
            maxTokens: 8192
          }
        ]
      }
    }
  }
}

CLI examples

clawdbot onboard --auth-choice opencode-zen
clawdbot models set opencode/claude-opus-4-5
clawdbot models list
See also: /gateway/configuration for full configuration examples.