# LLM Providers

Unified configuration for multiple LLM providers. Switch between OpenAI, Anthropic, Google, and Ollama without changing your code.
## Supported Providers
| Provider | Models | Streaming |
|---|---|---|
| OpenAI | GPT-4, GPT-4 Turbo, GPT-3.5 Turbo, o1 | Yes |
| Anthropic | Claude 3 Opus, Sonnet, Haiku | Yes |
| Google | Gemini Pro, Gemini 1.5 Pro/Flash | Yes |
| Ollama | Llama 3, Mistral, Mixtral, custom | Yes |
## Configuration

Configure providers in `sekuire.yml`:

```yaml
# sekuire.yml
agents:
  assistant:
    name: "AI Assistant"
    system_prompt: "./prompts/assistant.md"
    llm:
      provider: "openai"  # openai | anthropic | google | ollama
      model: "gpt-4-turbo"
      api_key_env: "OPENAI_API_KEY"
      temperature: 0.7
      max_tokens: 4096
```
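Note that `api_key_env` names an environment variable rather than embedding the secret in the config file. A minimal sketch of that indirection in Python (the `llm_config` dict and `resolve_api_key` helper are illustrative, not part of the SDK):

```python
import os

# Parsed form of the `llm` block above (hypothetical loader output).
llm_config = {
    "provider": "openai",
    "model": "gpt-4-turbo",
    "api_key_env": "OPENAI_API_KEY",
    "temperature": 0.7,
    "max_tokens": 4096,
}

def resolve_api_key(config: dict) -> str:
    """Look up the API key named by `api_key_env`; fail loudly if unset."""
    env_var = config["api_key_env"]
    key = os.environ.get(env_var)
    if key is None:
        raise RuntimeError(f"Set the {env_var} environment variable")
    return key

os.environ.setdefault("OPENAI_API_KEY", "sk-example")  # for illustration only
print(resolve_api_key(llm_config))
```

Keeping keys out of the YAML means the same config can be committed to version control and deployed across environments.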
### OpenAI

```yaml
# sekuire.yml
llm:
  provider: "openai"
  model: "gpt-4-turbo"
  api_key_env: "OPENAI_API_KEY"
  organization_env: "OPENAI_ORG_ID"  # optional
  temperature: 0.7
  max_tokens: 4096
  top_p: 1.0
```
### Anthropic

```yaml
# sekuire.yml
llm:
  provider: "anthropic"
  model: "claude-3-opus-20240229"
  api_key_env: "ANTHROPIC_API_KEY"
  temperature: 0.7
  max_tokens: 4096
```
### Google

```yaml
# sekuire.yml
llm:
  provider: "google"
  model: "gemini-1.5-pro"
  api_key_env: "GOOGLE_API_KEY"
  temperature: 0.7
  max_tokens: 8192
```
### Ollama (Local)

```yaml
# sekuire.yml
llm:
  provider: "ollama"
  model: "llama3:70b"
  base_url: "http://localhost:11434"  # default
  temperature: 0.7
```
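Because Ollama runs locally, it needs no API key; only the server address matters, and `base_url` falls back to the local daemon when omitted. A small sketch of that fallback (the `ollama_base_url` helper is illustrative, not the SDK's internals):

```python
def ollama_base_url(config: dict) -> str:
    """Return the configured base_url, falling back to the local daemon."""
    # Ollama exposes its HTTP API on port 11434 by default.
    return config.get("base_url", "http://localhost:11434")

# An entry that omits base_url resolves to the local default.
print(ollama_base_url({"provider": "ollama", "model": "llama3:70b"}))
# → http://localhost:11434
```

Set `base_url` explicitly when the Ollama server runs on another host, e.g. a shared GPU machine.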
## Programmatic Configuration

### TypeScript

```typescript
import { getAgent, OpenAIProvider, AnthropicProvider } from '@sekuire/sdk';

// Override the configured provider at runtime
const agent = await getAgent('assistant', {
  llm: new OpenAIProvider({
    model: 'gpt-4-turbo',
    apiKey: process.env.OPENAI_API_KEY,
    temperature: 0.5
  })
});

// Or switch providers dynamically
const claudeAgent = await getAgent('assistant', {
  llm: new AnthropicProvider({
    model: 'claude-3-opus-20240229',
    apiKey: process.env.ANTHROPIC_API_KEY
  })
});
```
### Python

```python
import os

from sekuire_sdk import get_agent
from sekuire_sdk.providers import OpenAIProvider, AnthropicProvider

# Override the configured provider at runtime
agent = await get_agent("assistant", llm=OpenAIProvider(
    model="gpt-4-turbo",
    api_key=os.environ["OPENAI_API_KEY"],
    temperature=0.5,
))

# Or switch providers dynamically
claude_agent = await get_agent("assistant", llm=AnthropicProvider(
    model="claude-3-opus-20240229",
    api_key=os.environ["ANTHROPIC_API_KEY"],
))
```
### Rust

```rust
use sekuire_sdk::{get_agent, AgentOptions, providers::{OpenAIProvider, AnthropicProvider}};

// Override the configured provider at runtime
let agent = get_agent(Some("assistant"), Some(AgentOptions {
    llm: Some(Box::new(OpenAIProvider::new(
        "gpt-4-turbo",
        std::env::var("OPENAI_API_KEY")?,
    )?)),
    ..Default::default()
})).await?;
```
## Next Steps
- Agent API - Using agents with providers
- Streaming - Stream responses from providers
- Config Schema - Full configuration reference