Ziri is a self-hosted proxy that sits between your apps and AI providers. Govern access, control costs, and enforce policies — without changing a line of application code.
When developers use LLM APIs directly, anyone with a key can use any model, costs spiral unpredictably, there's no audit trail, and you can't enforce rules. Ziri fixes this by acting as a smart gateway that every request must pass through.
Centralized control over every AI API request, without changing how your applications are built.
Define who can use what, when. Cedar policies give you granular, declarative control over every AI request.
Per-user keys with automatic rotation. No more sharing raw provider keys across your team.
Sliding window throttling per user. Prevent runaway scripts from burning through your budget.
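Sliding-window throttling counts a user's requests over a moving time window rather than fixed calendar buckets, so a burst at a bucket boundary can't double the effective limit. A minimal in-memory sketch of the idea (Ziri's actual implementation may differ):

```typescript
// Sliding-window rate limiter: track per-user request timestamps and
// reject once the count inside the window reaches the limit.
class SlidingWindowLimiter {
  private hits = new Map<string, number[]>(); // user -> request timestamps (ms)
  private windowMs: number;
  private maxRequests: number;

  constructor(windowMs: number, maxRequests: number) {
    this.windowMs = windowMs; // window length, e.g. 60_000 for one minute
    this.maxRequests = maxRequests; // allowed requests per window
  }

  // Returns true if the request is allowed, false if the user is throttled.
  allow(user: string, now: number = Date.now()): boolean {
    const cutoff = now - this.windowMs;
    // Keep only timestamps still inside the window.
    const recent = (this.hits.get(user) ?? []).filter((t) => t > cutoff);
    if (recent.length >= this.maxRequests) {
      this.hits.set(user, recent);
      return false; // over the limit: reject before reaching the provider
    }
    recent.push(now);
    this.hits.set(user, recent);
    return true;
  }
}

// Example: at most three requests per one-second window.
const limiter = new SlidingWindowLimiter(1_000, 3);
```

Because the window slides with each request, a user who exhausts the limit regains capacity gradually as old timestamps age out, rather than all at once.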
Daily and monthly spend summaries per user and team. Set spend caps to stop overruns before budgets run out.
Full record of every authorization decision. Know who accessed which model, when, and why.
Manage keys, policies, and users, and view usage from a browser. Bundled with the gateway — no separate install, no extra infrastructure.
Admin, Viewer, User Admin, Policy Admin. The right level of access for every team member.
Every AI request flows through five stages. Your apps don't change — they just point to Ziri instead of OpenAI or Anthropic directly.
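In practice, pointing an app at Ziri usually means changing only the base URL the HTTP client targets and swapping the raw provider key for a per-user gateway key. A minimal sketch, assuming Ziri exposes an OpenAI-compatible endpoint at a hypothetical internal hostname:

```typescript
// Assumption: Ziri serves an OpenAI-compatible API; the hostname and key
// below are illustrative placeholders, not real Ziri defaults.
const ZIRI_BASE_URL = "https://ziri.internal.example/v1";

// Build the same chat-completions request you would send to OpenAI;
// only the base URL and the API key change.
function chatCompletionRequest(baseUrl: string, apiKey: string, body: object) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    method: "POST" as const,
    headers: {
      Authorization: `Bearer ${apiKey}`, // per-user Ziri key, not a provider key
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  };
}

const req = chatCompletionRequest(ZIRI_BASE_URL, "ziri_user_key_123", {
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "hello" }],
});
```

Since only the destination changes, existing SDKs that accept a base-URL override can be redirected without touching request-building code.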
Ziri ships as a single Docker image. Run it anywhere — your cloud, your datacenter, your laptop. The Admin UI is bundled, no separate setup required.
Define access rules in Cedar — the same policy language used by AWS. Readable, testable, and version-controlled alongside your code.
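As a flavor of what such rules look like, here is a short Cedar sketch. The entity types and attributes (`Team`, `Action`, `Model`, `context.hour`) are illustrative assumptions, not Ziri's actual schema:

```cedar
// Allow the ml-research team to invoke chat models...
permit (
  principal in Team::"ml-research",
  action == Action::"invoke",
  resource in ModelFamily::"chat"
);

// ...but deny everyone the most expensive model outside business hours.
forbid (
  principal,
  action == Action::"invoke",
  resource == Model::"gpt-4o"
) when {
  context.hour < 9 || context.hour > 17
};
```

Because policies are plain text, they can live in the same repository as your application and go through the same review and CI process as code.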
The @ziri/sdk npm package gives you programmatic control over keys, policies, usage, and more.
Deploy Ziri in minutes and start governing your team's AI API usage. Open source, self-hosted, and free to get started.