litellm
Python SDK, Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]
Transport: sse
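LiteLLM's core idea is that every backing provider is called through the same OpenAI-style chat-completions request shape, with only the model string changing. A minimal sketch of that payload (the model name and parameter values are illustrative assumptions, not taken from this page):

```python
import json

# Sketch of the OpenAI-format chat-completions payload that LiteLLM
# (SDK or proxy) accepts regardless of the backing provider; swapping
# providers means swapping the `model` string, not the request shape.
payload = {
    "model": "anthropic/claude-3-haiku-20240307",  # illustrative provider-prefixed model
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
    "temperature": 0.2,  # standard OpenAI-format sampling parameter
}

# The same JSON body can be POSTed to a LiteLLM proxy's
# /chat/completions endpoint or passed to the SDK.
body = json.dumps(payload)
print(body[:20])
```

The provider prefix in the model string (`anthropic/...`, `bedrock/...`, etc.) is how LiteLLM routes the request to the right backend while keeping the payload format constant.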
Reliability
- 30d Uptime: —
- Avg Latency: —
- P95 Latency: —
24h Health Timeline: No health check data yet
GitHub
- Stars: 41,861
- Contributors: 392
- Commits (90d): 7,019
- Last Commit: 4/2/2026
- First Commit: 7/27/2023
Data Provenance
- name: Inferred
- description: Inferred
Embed Reliability Badge
- Markdown: ![YellowMCP Reliability](https://yellowmcp.com/api/v1/servers/litellm/badge)
- HTML: <img src="https://yellowmcp.com/api/v1/servers/litellm/badge" alt="YellowMCP Reliability">