An open-source LLM gateway that enforces policies, rate limits, and spend caps on every request—without changing your code.
When developers use LLM APIs directly, anyone with a key can use any model, costs spiral unpredictably, there's no audit trail, and you can't enforce rules. Ziri fixes this by acting as a smart gateway that every request must pass through.
LLM usage inside teams is chaotic by default—shared API keys, no policy layer, no cost controls, and no audit trail.
Ziri was built to be the missing control plane: a lightweight, self-hosted gateway that teams can run themselves and extend in the open.
Centralized control over every LLM API request, without changing how your applications are built.
Cedar policies give you declarative control over every LLM request.
Per-user keys with automatic rotation. No more shared provider keys.
Sliding window throttling per user. Stop runaway scripts.
Per-user and per-team spend summaries with configurable caps.
Full record of every authorization decision.
Manage keys, policies, and usage from a browser. Bundled with the gateway.
Admin, Viewer, User Admin, Policy Admin. Right-sized access for every role.
Fork it, extend it, make it yours.
Plain text files, version-controlled, git-friendly.
Zero dependencies—everything stored in a single file.
No separate install, no extra infrastructure.
Easy to read, easy to contribute to.
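The per-user sliding-window throttling above can be sketched in a few lines. This is a minimal illustration of the technique, not Ziri's actual implementation; the limits and window size are arbitrary:

```typescript
// Sliding-window rate limiter sketch. Each user gets a rolling window of
// recent request timestamps; a request is allowed only while the count of
// timestamps inside the window stays under the limit.
class SlidingWindowLimiter {
  private timestamps = new Map<string, number[]>();

  constructor(
    private limit: number,    // max requests per window
    private windowMs: number, // window length in milliseconds
  ) {}

  // Returns true if the request is allowed, false if throttled.
  allow(userId: string, now: number = Date.now()): boolean {
    const cutoff = now - this.windowMs;
    // Drop timestamps that have slid out of the window.
    const recent = (this.timestamps.get(userId) ?? []).filter((t) => t > cutoff);
    if (recent.length >= this.limit) {
      this.timestamps.set(userId, recent);
      return false; // runaway script gets throttled here
    }
    recent.push(now);
    this.timestamps.set(userId, recent);
    return true;
  }
}
```

Unlike fixed-window counters, the window slides with each request, so a burst straddling a window boundary can't double the effective limit.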
Every LLM request flows through five stages. Your apps don't change — they just point to Ziri instead of OpenAI or Anthropic directly.
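In practice, "pointing to Ziri" usually means swapping the provider base URL for your gateway's. A rough sketch, assuming Ziri exposes an OpenAI-compatible chat endpoint and issues per-user keys; the URL, port, and key format here are illustrative assumptions, not documented defaults:

```typescript
// Sketch: routing an OpenAI-style chat request through Ziri instead of the
// provider. Base URL, port, and key prefix are assumptions for illustration.
interface ChatRequest {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string;
}

function buildChatRequest(
  ziriBaseUrl: string, // e.g. "http://localhost:8080/v1" -- your Ziri deployment
  ziriApiKey: string,  // per-user key issued by Ziri, not a shared provider key
  model: string,
  prompt: string,
): ChatRequest {
  return {
    url: `${ziriBaseUrl}/chat/completions`,
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${ziriApiKey}`,
    },
    body: JSON.stringify({ model, messages: [{ role: "user", content: prompt }] }),
  };
}

// Sending it is then a one-liner:
// const res = await fetch(req.url, req);
```

The application code is otherwise unchanged; only the base URL and the key it presents are different.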
Ziri ships as a single Docker image. Run it anywhere — your cloud, your datacenter, your laptop. The Admin UI is bundled, no separate setup required.
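A single-image deployment typically looks something like the following. The image name, tag, port, and volume path are illustrative assumptions, not Ziri's documented defaults; check the project README for the real values:

```shell
# Sketch only: image name, port, and data path are assumptions.
# -p exposes the gateway; -v persists the single-file data store.
docker run -d \
  -p 8080:8080 \
  -v "$(pwd)/ziri-data:/data" \
  ziri/ziri:latest
```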
Define access rules in Cedar — the open-source policy language developed by AWS. Readable, testable, and version-controlled alongside your code.
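For a flavor of what such rules look like, here is a small pair of policies in standard Cedar syntax. The entity types (`Team`, `Role`, `Action`, `Model`) are illustrative assumptions, not necessarily Ziri's actual policy schema:

```cedar
// Illustrative only: entity and action names are assumptions, not Ziri's schema.
// Allow anyone on the engineering team to call the inexpensive model...
permit (
  principal in Team::"engineering",
  action == Action::"invokeModel",
  resource == Model::"gpt-4o-mini"
);

// ...but deny the expensive model to everyone except admins.
// In Cedar, forbid always overrides permit.
forbid (
  principal,
  action == Action::"invokeModel",
  resource == Model::"gpt-4o"
) unless {
  principal in Role::"admin"
};
```

Because these are plain text, they diff cleanly in code review and can be unit-tested before deployment.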
The @ziri/sdk npm package gives you programmatic control over keys, policies, usage, and more.
Deploy Ziri in minutes and start governing your team's LLM API usage. Open source, self-hosted, and free to get started.