
Using prxy.monster with Cursor

Cursor supports custom OpenAI-compatible endpoints. Point the base URL at prxy.monster, enter your prxy.monster API key, and every “ask” / “edit” / “agent” request routes through prxy.monster.

Configure

  1. Open Cursor → Settings → Models
  2. Scroll to OpenAI API Key section
  3. Click Override OpenAI Base URL
  4. Enter:
    https://api.prxy.monster/v1
  5. Enter your prxy.monster key in the OpenAI API Key field:
    prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx
  6. Click Verify — Cursor will make a test call.

That’s it. Cursor now routes through prxy.monster.

Cursor’s setting label specifically says “OpenAI” — but it’ll send any OpenAI-compatible request shape, which is what prxy.monster speaks. Models like gpt-4o, claude-sonnet-4-6 (via prxy’s translator), and others all work.
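Cursor’s Verify button issues a small OpenAI-style request against the base URL. If you want to reproduce that check outside Cursor, here is a minimal sketch — it assumes prxy.monster exposes the standard `/v1/chat/completions` path, and `build_verify_request` is a hypothetical helper, not part of any SDK:

```python
import json
import urllib.request

def build_verify_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build (but don't send) the kind of minimal chat-completion
    call Cursor's Verify button makes against a custom base URL."""
    payload = {
        "model": "gpt-4o-mini",  # any model id prxy.monster understands
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_verify_request(
    "https://api.prxy.monster/v1",
    "prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx",
)
# Send with urllib.request.urlopen(req) once your key is real.
```

If the call returns a normal chat completion, Cursor’s Verify step will pass with the same settings.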

Code change

None. Cursor is an IDE; you don’t touch its code.

Custom model selection

In the Models settings, you can also enable “Custom Models” and add model identifiers that prxy.monster understands. Examples:

  • gpt-4o
  • gpt-4o-mini
  • claude-sonnet-4-6 (prxy.monster translates Anthropic models behind the OpenAI shape)
  • gemini-2.0-flash
  • bedrock/<model-id> — for AWS Bedrock routing
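You can sanity-check identifiers outside Cursor too. Assuming prxy.monster implements the OpenAI-style `GET /v1/models` listing endpoint (this guide doesn’t confirm that), a sketch with a hypothetical `fetch_model_ids` helper:

```python
import json
import urllib.request

def parse_model_ids(payload: dict) -> list[str]:
    """Extract model ids from an OpenAI-style model-list response."""
    return [m["id"] for m in payload.get("data", [])]

def fetch_model_ids(base_url: str, api_key: str) -> list[str]:
    """GET /models from an OpenAI-compatible endpoint and return the ids."""
    req = urllib.request.Request(
        base_url.rstrip("/") + "/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_model_ids(json.load(resp))

# fetch_model_ids("https://api.prxy.monster/v1", "prxy_live_...")
```

Any id that appears in the listing should be safe to add as a Custom Model in Cursor.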

Verify

curl https://api.prxy.monster/health

Or trigger any Cursor agent action; a successful response confirms the routing.
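The same health check can be scripted. A sketch in Python — the `/health` endpoint is the one shown above; note it sits at the API root, not under `/v1`:

```python
import urllib.request

def health_url(base: str) -> str:
    """Join the base host with the unversioned /health endpoint."""
    return base.rstrip("/") + "/health"

def check_health(url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

# check_health(health_url("https://api.prxy.monster"))
```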

What you get

  • Pattern memory across all your projects — Cursor’s “agent” mode generates many similar requests across projects. The patterns module learns from successes and re-injects them.
  • Semantic cache — repeated “explain this function” / “refactor this” style requests hit cache.
  • Cost guards — hard daily cap on Cursor’s API spend (Cursor’s own pricing tier doesn’t enforce this for BYOK users).
  • MCP optimization — if you wire MCP into Cursor, the mcp-optimizer module prunes irrelevant tools per request.
A pipe configuration that enables the modules above:

PRXY_PIPE=mcp-optimizer,semantic-cache,patterns,ipc

For users on Cursor Pro who want hard cost caps:

PRXY_PIPE=cost-guard,exact-cache,semantic-cache,patterns,ipc
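PRXY_PIPE reads as an ordered, comma-separated module list. A sketch of how a script might read and sanity-check the value before launching anything that depends on it — the module names are the ones this guide mentions, and `parse_pipe` is a hypothetical helper, not part of prxy.monster:

```python
import os

# Module names mentioned in this guide; an assumption, not an exhaustive list.
KNOWN_MODULES = {
    "cost-guard", "exact-cache", "semantic-cache",
    "patterns", "mcp-optimizer", "ipc",
}

def parse_pipe(value: str) -> list[str]:
    """Split a PRXY_PIPE value into its ordered module list,
    rejecting names this guide doesn't mention."""
    modules = [m.strip() for m in value.split(",") if m.strip()]
    unknown = [m for m in modules if m not in KNOWN_MODULES]
    if unknown:
        raise ValueError(f"unknown pipe modules: {unknown}")
    return modules

pipe = parse_pipe(os.environ.get(
    "PRXY_PIPE", "cost-guard,exact-cache,semantic-cache,patterns,ipc"))
```

Order is preserved, so the list also documents the sequence in which the modules would apply.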

Common issues

  • “Cursor uses its own backend by default” — true for the free tier and for default models. The Custom OpenAI Base URL setting only kicks in if you’ve enabled a custom model OR opted to use your own key. Check both.
  • “My Cursor Pro subscription stopped counting” — using the Custom OpenAI key path means you BYOK; Cursor stops billing you per-request and your prxy.monster (and underlying provider) bill is the source of truth.
  • Inline edits / Composer — both use the same backend wiring as Chat. Same routing applies.

What this does NOT change

Cursor’s local features — fast tab completion, codebase indexing, file search — happen client-side and don’t go through any LLM endpoint. Only the chat / agent / inline-edit calls route through prxy.monster.

Full example

No code example needed — Cursor is a GUI-only setup. The Settings → Models flow above is the entire integration.

Cursor’s settings UI shifts occasionally. If “Override OpenAI Base URL” isn’t where this guide says, check the Cursor docs for the current path. The setting itself has been stable for a long time.
