Using prxy.monster with Cursor
Cursor supports custom OpenAI-compatible endpoints. Point its base URL at prxy.monster, supply your prxy.monster API key, and every “ask” / “edit” / “agent” request routes through prxy.monster.
Configure
- Open Cursor → Settings → Models
- Scroll to OpenAI API Key section
- Click Override OpenAI Base URL
- Enter the base URL:

  ```
  https://api.prxy.monster/v1
  ```

- Enter your prxy.monster key in the OpenAI API Key field:

  ```
  prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx
  ```

- Click Verify; Cursor will make a test call.
That’s it. Cursor now routes through prxy.monster.
Cursor’s setting label specifically says “OpenAI” — but it’ll send any OpenAI-compatible request shape, which is what prxy.monster speaks. Models like gpt-4o, claude-sonnet-4-6 (via prxy’s translator), and others all work.
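To make the “OpenAI-compatible request shape” concrete, here is a minimal stdlib sketch of the chat-completion payload a client like Cursor sends to the base URL. The helper name and placeholder key are illustrative, not part of Cursor or prxy.monster:

```python
import json

def build_chat_request(base_url, api_key, model, prompt):
    """Assemble an OpenAI-compatible chat completion request (illustrative)."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = build_chat_request(
    "https://api.prxy.monster/v1",
    "prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx",  # placeholder key
    "claude-sonnet-4-6",                   # translated behind the OpenAI shape
    "Explain this function.",
)
```

The same shape works regardless of which provider the model ultimately resolves to; that resolution happens on prxy.monster’s side.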
Code change
None. Cursor is an IDE; you don’t touch its code.
Custom model selection
In the Models settings, you can also enable “Custom Models” and add model identifiers that prxy.monster understands. Examples:
- gpt-4o
- gpt-4o-mini
- claude-sonnet-4-6 (prxy.monster translates Anthropic models behind the OpenAI shape)
- gemini-2.0-flash
- bedrock/<model-id> for AWS Bedrock routing
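The routing these identifiers imply can be sketched as a small dispatcher. The function and category names below are illustrative, not prxy.monster internals:

```python
def route_for(model: str) -> str:
    """Guess which backend a model identifier maps to (illustrative only)."""
    if model.startswith("bedrock/"):
        return "aws-bedrock"           # bedrock/<model-id> routes to Bedrock
    if model.startswith("claude-"):
        return "anthropic-translated"  # Anthropic models behind the OpenAI shape
    if model.startswith("gemini-"):
        return "google"
    return "openai"
```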
Verify
```
curl https://api.prxy.monster/health
```

Or trigger any Cursor agent action; a successful response confirms routing.
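If you prefer a scripted check, here is a stdlib sketch of the same health probe. The endpoint is the one above; the helper itself is illustrative:

```python
from urllib.request import urlopen

def check_health(base: str = "https://api.prxy.monster") -> bool:
    """Return True if the health endpoint answers HTTP 200."""
    try:
        with urlopen(base.rstrip("/") + "/health", timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False
```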
What you get
- Pattern memory across all your projects: Cursor’s “agent” mode generates many similar requests across projects. The `patterns` module learns from successes and re-injects them.
- Semantic cache: repeated “explain this function” / “refactor this” style requests hit cache.
- Cost guards: a hard daily cap on Cursor’s API spend (Cursor’s own pricing tier doesn’t enforce this for BYOK users).
- MCP optimization: if you wire MCP into Cursor, the `mcp-optimizer` module prunes irrelevant tools per request.
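The cost-guard idea, a hard daily cap that blocks requests once spend reaches the limit, can be sketched as follows. This is a conceptual illustration, not prxy.monster’s actual implementation, and all names are made up:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DailyCostGuard:
    """Reject requests once today's spend would exceed a hard cap (illustrative)."""
    cap_usd: float
    spent_usd: float = 0.0
    day: date = field(default_factory=date.today)

    def allow(self, est_cost_usd: float) -> bool:
        today = date.today()
        if today != self.day:            # new day: reset the counter
            self.day, self.spent_usd = today, 0.0
        if self.spent_usd + est_cost_usd > self.cap_usd:
            return False                 # hard cap: block the request
        self.spent_usd += est_cost_usd
        return True

guard = DailyCostGuard(cap_usd=5.00)
```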
Recommended pipeline
```
PRXY_PIPE=mcp-optimizer,semantic-cache,patterns,ipc
```

For users on Cursor Pro who want hard cost caps:

```
PRXY_PIPE=cost-guard,exact-cache,semantic-cache,patterns,ipc
```

Common issues
- “Cursor uses its own backend by default” — true for the free tier and for default models. The Custom OpenAI Base URL setting only kicks in if you’ve enabled a custom model OR opted to use your own key. Check both.
- “My Cursor Pro subscription stopped counting” — using the Custom OpenAI key path means you bring your own key (BYOK); Cursor stops billing you per-request, and your prxy.monster (and underlying provider) bill becomes the source of truth.
- Inline edits / Composer — both use the same backend wiring as Chat. Same routing applies.
What this does NOT change
Cursor’s local features — fast tab completion, codebase indexing, file search — happen client-side and don’t go through any LLM endpoint. Only the chat / agent / inline-edit calls route through prxy.monster.
Full example
No code example needed — Cursor is a GUI-only setup. The Settings → Models flow above is the entire integration.
Cursor’s settings UI shifts occasionally. If “Override OpenAI Base URL” isn’t where this guide says, check the Cursor docs for the current path. The setting itself has been stable for a long time.