# Migrating from OpenRouter
OpenRouter is a great router. prxy.monster adds composable modules — caching, cost guards, persistent learning — on top of routing.
## What’s the same
- One key for many providers.
- OpenAI-compatible API.
- Drop-in `BASE_URL` replacement (see the sketch below).
- BYOK supported.
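Concretely, the drop-in swap is just the constructor arguments. A minimal sketch with the OpenAI Python SDK (keys are placeholders; the unprefixed model name follows the router examples later on this page):

```python
# Only base_url and api_key change; the rest of your code stays as-is.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.prxy.monster/v1",  # was https://openrouter.ai/api/v1
    api_key="prxy_live_xxx",                 # was sk-or-xxx
)

response = client.chat.completions.create(
    model="claude-sonnet-4-6",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```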
## What’s different
| Feature | OpenRouter | prxy.monster |
|---|---|---|
| Routing | Built-in fallback chains | router module (v1.1); explicit until then |
| Caching | None at gateway layer | exact-cache, semantic-cache |
| Cost limits | Per-key budget | cost-guard per-request/day/month |
| Persistent memory | None | patterns module |
| Local mode | No | Yes — single Docker container |
| Token markup | Yes (~5–10%) | No — BYOK, gateway tier billed separately |
## Diff: env vars
```diff
- OPENAI_BASE_URL=https://openrouter.ai/api/v1
- OPENAI_API_KEY=sk-or-xxx
+ OPENAI_BASE_URL=https://api.prxy.monster/v1
+ OPENAI_API_KEY=prxy_live_xxx
+ # Set your provider key in the dashboard or pass it per-request via headers.
```

If you were using OpenRouter for Anthropic models specifically:

```diff
- (No Anthropic-shape support on OpenRouter — you used /chat/completions for everything)
+ ANTHROPIC_BASE_URL=https://api.prxy.monster
+ ANTHROPIC_API_KEY=prxy_live_xxx
```

You can now use the native Anthropic SDK against `/v1/messages`, not just the OpenAI shape.
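A minimal sketch of that, using the official Anthropic Python SDK (key and model values are placeholders):

```python
# The native Anthropic client pointed at the gateway's /v1/messages endpoint.
import anthropic

client = anthropic.Anthropic(
    base_url="https://api.prxy.monster",
    api_key="prxy_live_xxx",
)

message = client.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)
print(message.content[0].text)
```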
## Migrating fallback chains
OpenRouter format:
```json
{
  "model": "anthropic/claude-sonnet-4-6",
  "models": ["openai/gpt-4o", "google/gemini-2.0-pro"]
}
```

prxy.monster (when the router module ships in v1.1):
```yaml
pipeline:
  - router:
      strategy: 'fallback'
      fallback_chain:
        - claude-sonnet-4-6
        - gpt-4o
        - gemini-2.0-pro
```

For v1, configure fallback at the SDK level until the router module ships, as sketched below.
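A minimal sketch of that interim client-side fallback, assuming the OpenAI Python SDK and the same chain as the config above:

```python
# Tries each model in order and returns the first successful response.
from openai import OpenAI, APIError

client = OpenAI(base_url="https://api.prxy.monster/v1", api_key="prxy_live_xxx")

FALLBACK_CHAIN = ["claude-sonnet-4-6", "gpt-4o", "gemini-2.0-pro"]

def complete_with_fallback(messages):
    last_error = None
    for model in FALLBACK_CHAIN:
        try:
            return client.chat.completions.create(model=model, messages=messages)
        except APIError as exc:
            last_error = exc  # provider down, rate-limited, etc.; try the next model
    raise last_error
```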
## Migrating route-by-cost
OpenRouter has :nitro and :floor suffixes. prxy.monster accomplishes the same with explicit configuration:
```yaml
pipeline:
  - cost-guard:
      perRequest: 0.005 # only allow cheap models
  - router: # v1.1 — picks cheapest available
      strategy: 'cheapest-first'
```
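Your client should be ready for the guard to block a request. The sketch below is an assumption, not the documented contract: it treats a cost-guard rejection as an HTTP 4xx error that the OpenAI SDK raises as `APIStatusError`. Check the cost-guard module docs for the real status code and error body.

```python
from openai import OpenAI, APIStatusError

client = OpenAI(base_url="https://api.prxy.monster/v1", api_key="prxy_live_xxx")

try:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Summarize this thread."}],
    )
except APIStatusError as exc:
    # Assumption: the gateway rejects over-budget requests with a 4xx error.
    print(f"Request blocked by cost-guard (HTTP {exc.status_code}): {exc.message}")
```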
## What you gain

- Caching — a 30% cache hit rate on a support workload turns into 30% cost savings, immediately.
- Cost caps that work — hard daily ceiling, not just monthly.
- Local mode — run the whole thing on your hardware for sensitive workloads.
- Persistent memory — your AI gets smarter at your specific use case.
## What you lose (and how we make up for it)
| OpenRouter feature | prxy.monster equivalent |
|---|---|
| Provider markup absorbed in their pricing | BYOK + flat gateway tier — usually cheaper at volume |
| :online (web search) | Add a custom module or use a provider’s native tool-calling |
| Public usage stats per model | (v1.1) /v1/usage endpoint with per-model rollups |
## Switch in 60 seconds
```bash
# 1. Sign up at prxy.monster
# 2. Copy your API key
# 3. Update env:
export OPENAI_BASE_URL=https://api.prxy.monster/v1
export OPENAI_API_KEY=prxy_live_xxxxxxxxxxxx
# 4. (One time) Add your provider keys at app.prxy.monster/providers
```

Your code doesn’t change.
Already have caching/learning needs you’ve been hacking around in your app code? Move them into a custom module. See SDK → Module interface.
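As a loose illustration only (the real contract lives in SDK → Module interface, and the hook names `on_request`/`on_response` below are hypothetical stand-ins), here is what moving an ad-hoc exact cache out of app code might look like:

```python
# Hypothetical module shape, not the actual prxy.monster API.
import hashlib
import json

class ExactCacheModule:
    """Illustrative pre/post-hook module that short-circuits repeated prompts."""

    def __init__(self):
        self._cache = {}

    def _key(self, request: dict) -> str:
        # Hash model + messages so identical prompts map to the same entry.
        blob = json.dumps(
            {"model": request["model"], "messages": request["messages"]},
            sort_keys=True,
        )
        return hashlib.sha256(blob.encode()).hexdigest()

    def on_request(self, request: dict):
        # Returning a cached response here would skip the upstream call.
        return self._cache.get(self._key(request))

    def on_response(self, request: dict, response: dict):
        self._cache[self._key(request)] = response
        return response
```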