LiteLLM
Open-source Python SDK and proxy server to call 100+ LLM APIs in OpenAI format, with cost tracking and load balancing.
About LiteLLM
LiteLLM is a Python SDK and proxy server that provides a unified interface to call 100+ LLM APIs in OpenAI-compatible format. It includes built-in cost tracking, load balancing, guardrails, and logging for managing multi-provider LLM deployments; the project has 44.3k stars on GitHub.
Best For
- Teams managing multiple LLM providers through a single gateway
- Organizations tracking AI costs across departments
Pros & Cons
Pros
- Single API for all major LLM providers reduces vendor lock-in
- Built-in cost tracking prevents unexpected API bills
- Open source with 44.3k GitHub stars and active development
Cons
- Configuration complexity for multi-provider setups
- Additional latency from proxy layer in some configurations
Pricing
Open source and free to self-host
Key Features
- Unified interface for 100+ LLM APIs in OpenAI format
- Built-in cost tracking, logging, and budget management
- Load balancing across multiple providers and models
- Proxy server for centralizing LLM access in organizations
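The load-balancing idea behind the features above can be illustrated with a conceptual sketch. This is not LiteLLM's actual `Router` class, just a plain-Python round-robin balancer showing how a gateway can cycle requests across interchangeable deployments (the deployment names are made up for illustration):

```python
# Conceptual sketch of round-robin load balancing across model deployments.
# This is NOT LiteLLM's real Router API; it only illustrates the idea.
import itertools


class RoundRobinBalancer:
    """Cycle requests across a fixed list of interchangeable deployments."""

    def __init__(self, deployments):
        self._cycle = itertools.cycle(deployments)

    def pick(self):
        # Return the next deployment in round-robin order.
        return next(self._cycle)


# Hypothetical deployments of the "same" model behind different providers.
balancer = RoundRobinBalancer(
    ["azure/gpt-4o", "openai/gpt-4o", "bedrock/claude-3"]
)
picks = [balancer.pick() for _ in range(4)]
# After the list is exhausted, selection wraps back to the first deployment.
```

LiteLLM's real router adds health checks, retries, and cost/latency-aware strategies on top of this basic rotation.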
Similar Tools
OpenHands
AI-driven development tool that assists with autonomous coding tasks using multiple AI models.
AutoGPT
Open-source autonomous AI agent framework for building and deploying self-directing AI applications.
MetaGPT
A multi-agent framework for AI software development with role-based agent collaboration.
Deer Flow
An open-source long-horizon SuperAgent framework that researches, codes, and creates with subagent orchestration.
SWE Agent
AI agent that automatically fixes GitHub issues using language models with NeurIPS 2024 recognition.
E2B
Open-source secure sandboxed environment with real-world tools for enterprise-grade AI agent development.