☁️ Model Providers
Code Puppy supports multiple AI providers and models. Use your favorite models or mix and match!
Supported Providers
- OpenAI: GPT-4, GPT-4o, GPT-3.5-turbo, and more. Env: OPENAI_API_KEY
- Anthropic: Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Haiku. Env: ANTHROPIC_API_KEY
- Google: Gemini Pro, Gemini Ultra, and more. Env: GOOGLE_API_KEY
- Cerebras: Ultra-fast inference with Cerebras chips. Env: CEREBRAS_API_KEY
- Mistral: Mistral Large, Mistral Medium, and open models. Env: MISTRAL_API_KEY
- Groq: Fast LLM inference with LPU chips. Env: GROQ_API_KEY
Setting Up API Keys
Option 1: Using /set Command (Recommended)
Set API keys directly in Code Puppy using lowercase key names. Keys are saved to puppy.cfg and persist across sessions:
```
/set openai_api_key sk-your-openai-key
/set anthropic_api_key sk-ant-your-anthropic-key
/set google_api_key your-google-key
/set cerebras_api_key your-cerebras-key
/set mistral_api_key your-mistral-key
/set groq_api_key your-groq-key
```
Keys set with /set are stored in ~/.code_puppy/puppy.cfg and loaded automatically. Much easier than managing environment variables!
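Since puppy.cfg is a plain INI-style file, you can inspect saved keys with standard tooling. A minimal sketch using Python's configparser; note the `[puppy]` section name is an assumption for illustration, not a documented detail of the file layout:

```python
# Sketch: read a key saved by /set back out of puppy.cfg.
# ASSUMPTION: the section is named [puppy]; adjust to match your actual file.
import configparser
from pathlib import Path

cfg = configparser.ConfigParser()
cfg.read(Path.home() / ".code_puppy" / "puppy.cfg")
api_key = cfg.get("puppy", "openai_api_key", fallback=None)
print("openai key configured:", api_key is not None)
```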
Option 2: Environment Variables
Alternatively, use environment variables (add to ~/.bashrc or ~/.zshrc):
```shell
export OPENAI_API_KEY="sk-your-openai-key"
export ANTHROPIC_API_KEY="sk-ant-your-anthropic-key"
export GOOGLE_API_KEY="your-google-key"
export CEREBRAS_API_KEY="your-cerebras-key"
export MISTRAL_API_KEY="your-mistral-key"
export GROQ_API_KEY="your-groq-key"
```
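If a provider is mysteriously failing, it is worth confirming the keys are actually visible to the shell you launch Code Puppy from. A quick check, using only the variable names listed above:

```shell
# Report which provider keys are visible in the current shell.
for var in OPENAI_API_KEY ANTHROPIC_API_KEY GOOGLE_API_KEY \
           CEREBRAS_API_KEY MISTRAL_API_KEY GROQ_API_KEY; do
  if printenv "$var" > /dev/null; then
    echo "$var is set"
  else
    echo "$var is missing"
  fi
done
```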
Switching Models
```
# Switch model on the fly
/model gpt-5.1
/model claude-4-5-sonnet
/model Gemini-3

# List available models
/model

# Add models from catalog
/add_model
```
Popular Models
| Provider | Model | Best For |
|---|---|---|
| OpenAI | gpt-5.1 | General purpose, fast |
| OpenAI | gpt-5.1-codex-api | Complex reasoning |
| Anthropic | claude-4-5-sonnet | Code generation, analysis |
| Anthropic | claude-4-5-opus | Most capable Claude |
| Anthropic | claude-4-5-haiku | Fast, cheap |
| Google | Gemini-3 | General purpose |
| Cerebras | Cerebras-GLM-4.6 | Very fast inference |
| Groq | synthetic-GLM-4.6 | Fast open models |
Round-Robin Mode
Distribute requests across multiple models for load balancing or cost optimization:
```ini
# In puppy.cfg
[round_robin]
enabled = true
models = gpt-5.1, claude-4-5-sonnet, Gemini-3
```
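Conceptually, round-robin just cycles through the configured list, one model per request. A minimal sketch of that selection logic (illustrative only, not Code Puppy's actual implementation):

```python
# Illustrative round-robin model selection: each request gets the next
# model in the configured list, wrapping back to the start.
from itertools import cycle

models = ["gpt-5.1", "claude-4-5-sonnet", "Gemini-3"]  # as in puppy.cfg
picker = cycle(models)

def next_model():
    """Return the model to use for the next request."""
    return next(picker)

print([next_model() for _ in range(5)])
# → ['gpt-5.1', 'claude-4-5-sonnet', 'Gemini-3', 'gpt-5.1', 'claude-4-5-sonnet']
```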
Adding Custom Models
Add custom models via ~/.code_puppy/extra_models.json:
```json
{
  "models": [
    {
      "name": "my-local-llama",
      "provider": "openai",
      "model_id": "Cerebras-GLM-4.6",
      "api_base": "http://localhost:8000/v1"
    },
    {
      "name": "azure-gpt4",
      "provider": "azure",
      "model_id": "gpt-5.1",
      "api_base": "https://my-deployment.openai.azure.com",
      "api_version": "2024-02-15-preview"
    }
  ]
}
```
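Because extra_models.json is plain JSON, a few lines of Python are enough to load and index the entries by name. A sketch under the file layout shown above; `load_extra_models` is a hypothetical helper, not part of Code Puppy:

```python
# Sketch: load custom model entries and index them by "name".
# Matches the extra_models.json layout shown above.
import json
from pathlib import Path

def load_extra_models(path):
    data = json.loads(Path(path).read_text())
    return {entry["name"]: entry for entry in data.get("models", [])}

# Usage (path as documented above):
# models = load_extra_models(Path.home() / ".code_puppy" / "extra_models.json")
# models["my-local-llama"]["api_base"]  # → "http://localhost:8000/v1"
```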
Per-Model Settings
Configure settings for specific models:
```
/model_settings
# Adjust temperature, max tokens, etc.
# Settings are saved per-model
```
Model Pinning
Pin specific models to agents:
```
# Pin Claude to the code-reviewer agent
/agent code-reviewer
/pin_model claude-4-5-sonnet

# Now code-reviewer always uses Claude;
# other agents use your default model
```