After the supply chain attack, here are some litellm alternatives

Reddit r/LocalLLaMA / 3/25/2026


Key Points

  • The article warns that litellm versions 1.82.7 and 1.82.8 on PyPI were compromised with credential-stealing malware via a supply-chain attack.
  • It recommends open-source alternatives, starting with Bifrost, a Go-based litellm replacement that claims substantially lower latency and uses a simple base-URL migration.
  • It highlights Kosong as an LLM abstraction layer designed for more agent-oriented workflows, unifying message formats and async tool orchestration across multiple providers.
  • It also suggests Helicone as an AI gateway focused on analytics, debugging, and observability, supporting many providers but with higher overhead than the first two options.

litellm versions 1.82.7 and 1.82.8 on PyPI were compromised with credential-stealing malware.
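If you're staying on litellm for now, the immediate mitigation is to exclude the compromised releases when installing. A minimal requirements fragment (version numbers are the two releases named above):

```
# requirements.txt -- block the compromised PyPI releases
litellm!=1.82.7,!=1.82.8
```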

And here are a few open-source alternatives:

1. Bifrost: Probably the most direct litellm replacement right now. Written in Go, claims ~50x lower P99 latency than litellm. Apache 2.0 licensed, supports 20+ providers. Migration from litellm only requires a one-line base URL change.
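To make the "one-line migration" concrete: both litellm's proxy and Bifrost expose an OpenAI-compatible endpoint, so switching means changing only the base URL your client points at. A minimal sketch; the ports and paths below are illustrative assumptions, not either project's documented defaults:

```python
# Sketch of the one-line base-URL migration between OpenAI-compatible
# gateways. Hostnames/ports here are assumptions -- check the docs.
LITELLM_PROXY = "http://localhost:4000"    # litellm proxy (assumed port)
BIFROST_GATEWAY = "http://localhost:8080"  # Bifrost gateway (assumed port)

def make_client_config(base_url: str, api_key: str = "dummy") -> dict:
    """Build the kwargs an OpenAI-style client constructor would take.
    The gateway holds the real provider keys, so the client key is a stub."""
    return {"base_url": f"{base_url}/v1", "api_key": api_key}

# Before: config = make_client_config(LITELLM_PROXY)
# After (the only line that changes):
config = make_client_config(BIFROST_GATEWAY)
```

The rest of the application code (model names, chat calls) stays untouched, since both gateways speak the same wire format.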

2. Kosong: An LLM abstraction layer open-sourced by Kimi, used in Kimi CLI. More agent-oriented than litellm: it unifies message structures and async tool orchestration behind pluggable chat providers. Supports OpenAI, Anthropic, Google Vertex, and other API formats.
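To illustrate what "abstraction layer with pluggable providers" means here, a generic sketch of the pattern: one neutral message type plus a provider protocol, so agent code never touches provider-specific formats. This is not Kosong's actual API, just the shape of the idea:

```python
# Generic sketch of an LLM abstraction layer (NOT Kosong's real API):
# a provider-neutral Message type and an async ChatProvider protocol,
# so agent loops are written once and providers plug in underneath.
import asyncio
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Message:
    role: str      # "user", "assistant", "tool"
    content: str

class ChatProvider(Protocol):
    async def complete(self, messages: list[Message]) -> Message: ...

class EchoProvider:
    """Stand-in provider; a real adapter would call OpenAI/Anthropic/etc."""
    async def complete(self, messages: list[Message]) -> Message:
        return Message(role="assistant", content=f"echo: {messages[-1].content}")

async def run_turn(provider: ChatProvider, history: list[Message]) -> list[Message]:
    """One agent turn: send the history, append the reply."""
    reply = await provider.complete(history)
    return history + [reply]

history = asyncio.run(run_turn(EchoProvider(), [Message("user", "hi")]))
```

Swapping providers then means swapping one adapter object, while the agent loop and message handling stay identical.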

3. Helicone: An AI gateway with strong analytics and debugging capabilities. Supports 100+ providers. Heavier than the first two but more feature-rich on the observability side.

submitted by /u/KissWild
