Routerly launches self-hosted LLM gateway
OPEN_SOURCE ↗
REDDIT · 18d ago · OPEN SOURCE RELEASE

Routerly is a self-hosted, OpenAI-compatible LLM gateway that routes requests across providers like OpenAI, Anthropic, Gemini, Mistral, and Ollama. It adds per-project token tracking and hard budget caps, so teams can swap models at runtime without changing app code.
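The "swap models without changing app code" claim rests on the OpenAI-compatible surface: the client only ever sees a base URL and a model string. A minimal client-side sketch of that idea follows; the base URL, port, and per-project key format are assumptions for illustration, not taken from the project's docs.

```python
# Sketch of the client's view of an OpenAI-compatible gateway. The base
# URL, port, and key format below are hypothetical placeholders.
GATEWAY_BASE_URL = "http://localhost:8080/v1"   # assumption: self-hosted endpoint
PROJECT_API_KEY = "rt-proj-example"             # assumption: per-project token


def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble an OpenAI-style chat payload; the gateway maps the model
    name to a concrete provider (OpenAI, Anthropic, Ollama, ...)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


# Swapping providers is a different model string (or a gateway-side routing
# rule), not an application code change.
openai_req = build_chat_request("gpt-4o", "Summarize this ticket.")
local_req = build_chat_request("ollama/llama3", "Summarize this ticket.")
```

Because the payload shape is the standard chat-completions format, existing OpenAI SDK clients only need their base URL and key pointed at the gateway.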

// ANALYSIS

This is a real infra play, not just another proxy wrapper. The pitch is attractive because it makes routing and spend control explicit, but the project will live or die on whether the policy engine beats simpler static rules in practice.

  • Nine policies cover cheapest, fastest, healthiest, most capable, context, fairness, rate limits, budget-remaining, and LLM-driven selection.
  • Running without a database (no Redis or PostgreSQL to deploy) makes it much easier to self-host on a small footprint.
  • OpenAI and Anthropic compatibility lowers adoption friction across Cursor, LangChain, Open WebUI, and similar clients.
  • The best fit is multi-tenant SaaS and local-first teams that need per-project tokens, hard caps, and failover.
  • The project is still early, and the site/repo licensing copy is inconsistent, both of which are rough edges serious users will notice fast.
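To make the "policy engine vs. static rules" question concrete, here is an illustrative sketch of how two of the listed policies (cheapest and budget-remaining) might interact; the class, fields, and numbers are invented for illustration and are not Routerly's actual engine.

```python
from dataclasses import dataclass


@dataclass
class Provider:
    name: str
    cost_per_1k_tokens: float  # USD per 1k tokens (illustrative figures)
    budget_remaining: float    # USD left under this project's hard cap


def pick_cheapest(providers: list[Provider]) -> Provider:
    """Cheapest policy: lowest per-token cost among providers that still
    have budget left. A hard cap acts as a filter, not a tiebreaker:
    an exhausted provider is excluded entirely."""
    eligible = [p for p in providers if p.budget_remaining > 0]
    if not eligible:
        raise RuntimeError("budget cap reached on every provider")
    return min(eligible, key=lambda p: p.cost_per_1k_tokens)


providers = [
    Provider("openai", cost_per_1k_tokens=0.0050, budget_remaining=12.0),
    Provider("anthropic", cost_per_1k_tokens=0.0030, budget_remaining=0.0),
    Provider("ollama", cost_per_1k_tokens=0.0000, budget_remaining=5.0),
]
# anthropic is cheapest on paper but over its cap, so ollama wins here.
```

A production engine would also weigh latency, health, context limits, and fairness together; the point of the sketch is only that even the simplest policy stops being a static rule once budget state feeds into it.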
// TAGS
routerly · llm · self-hosted · open-source · api · devtool

DISCOVERED

18d ago · 2026-03-24

PUBLISHED

19d ago · 2026-03-24

RELEVANCE

8 / 10

AUTHOR

nurge86