- Support an LLM_PROVIDER env var to override the default provider (openai/gpt-4o-mini)
- Add an optional 'provider' parameter to API endpoints for per-request overrides
- Implement provider validation to ensure the required API keys exist
- Update documentation and examples with the new configuration options

Removes the need to hardcode providers in config.yml.
# LLM Provider Keys
OPENAI_API_KEY=your_openai_key_here
DEEPSEEK_API_KEY=your_deepseek_key_here
ANTHROPIC_API_KEY=your_anthropic_key_here
GROQ_API_KEY=your_groq_key_here
TOGETHER_API_KEY=your_together_key_here
MISTRAL_API_KEY=your_mistral_key_here
GEMINI_API_TOKEN=your_gemini_key_here

# Optional: Override the default LLM provider
# Examples: "openai/gpt-4", "anthropic/claude-3-opus", "deepseek/chat", etc.
# If not set, uses the provider specified in config.yml (default: openai/gpt-4o-mini)
# LLM_PROVIDER=anthropic/claude-3-opus
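The resolution order described above (per-request 'provider' parameter, then LLM_PROVIDER, then the config.yml default) could be sketched as follows. This is an illustrative Python sketch, not the actual implementation: the names `PROVIDER_KEY_VARS`, `DEFAULT_PROVIDER`, and `resolve_provider` are hypothetical.

```python
import os

# Hypothetical mapping from provider prefix to the env var holding its key;
# mirrors the keys listed in the .env example above.
PROVIDER_KEY_VARS = {
    "openai": "OPENAI_API_KEY",
    "deepseek": "DEEPSEEK_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "groq": "GROQ_API_KEY",
    "together": "TOGETHER_API_KEY",
    "mistral": "MISTRAL_API_KEY",
    "gemini": "GEMINI_API_TOKEN",
}

# Default from config.yml when neither override is present.
DEFAULT_PROVIDER = "openai/gpt-4o-mini"


def resolve_provider(request_override=None):
    """Resolve the provider: request parameter > LLM_PROVIDER env var > default.

    Validates that the API key for the chosen provider is set, so a
    misconfigured override fails fast instead of at call time.
    """
    provider = request_override or os.environ.get("LLM_PROVIDER") or DEFAULT_PROVIDER
    prefix = provider.split("/", 1)[0]
    key_var = PROVIDER_KEY_VARS.get(prefix)
    if key_var is None:
        raise ValueError(f"Unknown provider prefix: {prefix!r} in {provider!r}")
    if not os.environ.get(key_var):
        raise ValueError(f"Missing API key: set {key_var} to use {provider}")
    return provider
```

A per-request override passed by an endpoint would simply call `resolve_provider(request_override="deepseek/chat")`, falling back through the same validation path.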