litellm
LiteLLM inference engine — unified access to 100+ LLM providers.
Classes
LiteLLMEngine
Bases: InferenceEngine
Inference via LiteLLM — routes to any supported provider.
LiteLLM normalizes all providers (OpenAI, Anthropic, Google, DeepSeek,
Groq, Together, Fireworks, OpenRouter, Mistral, Cohere, xAI, Perplexity,
etc.) to OpenAI-format input/output. Model selection uses LiteLLM's
provider/model convention, e.g. anthropic/claude-sonnet-4-20250514.
API keys are read from environment variables following each provider's convention (OPENAI_API_KEY, ANTHROPIC_API_KEY, GROQ_API_KEY, etc.).