Create an account to get started. You can set up an org for your team later.
Credits can be used with any model or provider.
Create an API key and start making requests. Fully OpenAI compatible.
Access all major models through a single, unified interface. OpenAI SDK works out of the box.
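Because the API follows the OpenAI Chat Completions format, any OpenAI-style client can talk to it by pointing at OpenRouter's base URL. A minimal sketch using only the Python standard library; the API key and model ID are placeholders:

```python
import json
import urllib.request

BASE_URL = "https://openrouter.ai/api/v1"

def build_chat_request(api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for OpenRouter."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "sk-or-...",  # placeholder key
    "openai/gpt-4o",  # example model ID
    [{"role": "user", "content": "Hello"}],
)
# urllib.request.urlopen(req) would send the request; omitted here.
```

The same payload shape works unchanged with the official OpenAI SDK by setting its `base_url` to the URL above.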
Reliable AI models via our distributed infrastructure. Fall back to other providers when one goes down.
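Fallback can be expressed per request: OpenRouter accepts a `models` array listing models to try in order if the primary is unavailable. A hedged sketch of the request body (the model IDs are examples, and the exact routing semantics are assumptions here):

```python
import json

# Assumed request shape: `models` lists fallbacks tried in order
# when the primary model or its provider is down.
payload = {
    "model": "openai/gpt-4o",  # primary
    "models": ["openai/gpt-4o", "anthropic/claude-3.5-sonnet"],
    "messages": [{"role": "user", "content": "Hello"}],
}
body = json.dumps(payload)
```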
Keep costs in check without sacrificing speed. OpenRouter runs at the edge, adding just ~25ms between your users and their inference.
Protect your organization with fine-grained data policies. Ensure prompts only go to the models and providers you trust.
114.5B Tokens/wk · 2.6s Latency · +131.79% Weekly growth
291.0B Tokens/wk · 2.2s Latency · -13.99% Weekly growth
Centralize your LLM logic, iterate faster, and clean up your code—Presets are now live on OpenRouter.
Track model uptime via API and get more control over your BYOK setup, including usage limits and testable keys.
We’re rolling out a simpler and more transparent platform fee structure: