
Using prxy.monster with Cline

Cline (formerly Claude Dev; Roo Code is a widely used fork) is a VS Code extension for autonomous coding agents. It supports custom OpenAI-compatible endpoints via its provider settings.

Configure

  1. Open VS Code → Cline extension → Settings (gear icon)
  2. Under API Provider, select OpenAI Compatible
  3. Set:
    • Base URL: https://api.prxy.monster/v1
    • API Key: prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx
    • Model ID: claude-sonnet-4-6 (or gpt-4o, gemini-2.0-flash, etc.)

That’s it. Click “Done” and start a new task. Cline routes every request through prxy.monster.
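Before starting a task, you can confirm the same Base URL and key work outside VS Code. A sketch, assuming prxy.monster passes the standard OpenAI-compatible `GET /v1/models` endpoint through (the key is the placeholder from above — substitute your real one):

```shell
# List available models through the proxy. Assumes the standard
# OpenAI-compatible /v1/models endpoint is proxied as-is.
# The key below is a placeholder; use your real prxy_live_ key.
curl -s https://api.prxy.monster/v1/models \
  -H "Authorization: Bearer prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx"
```

A JSON model list means both the Base URL and the key are correct; an auth error means the key is wrong, independent of anything Cline is doing.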

Cline can also use Anthropic as a direct provider. If you select that, set the Anthropic Base URL to https://api.prxy.monster instead. Same effect.
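If you take the Anthropic provider route, you can sanity-check that Base URL with a direct Messages API call. A sketch, assuming prxy.monster passes the Anthropic Messages API through unchanged (placeholder key, as above):

```shell
# Anthropic Messages API through the proxy — same Base URL you set in
# Cline's Anthropic provider settings. Substitute your real key.
curl -s https://api.prxy.monster/v1/messages \
  -H "x-api-key: prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model":"claude-sonnet-4-6","max_tokens":32,"messages":[{"role":"user","content":"ping"}]}'
```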

Code change

None — Cline is a VS Code extension.

Verify

curl https://api.prxy.monster/health

Then start a Cline task with a simple prompt; a successful response confirms requests are routed through prxy.monster.
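For an end-to-end check of the exact path Cline uses, hit the chat completions endpoint directly with the model ID you configured. A sketch with the placeholder key:

```shell
# End-to-end check of the OpenAI-compatible chat path Cline talks to.
# Substitute your real prxy_live_ key and configured model ID.
curl -s https://api.prxy.monster/v1/chat/completions \
  -H "Authorization: Bearer prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{"model":"claude-sonnet-4-6","messages":[{"role":"user","content":"ping"}]}'
```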

What you get

Cline runs long autonomous agentic tasks, often involving many tool calls and multi-step plans. This is exactly the workload prxy.monster is built for:

  • MCP optimization — Cline can use MCP servers; the mcp-optimizer module prunes irrelevant tool definitions per request, which can cut prompt tokens substantially.
  • Pattern memory — Cline’s repeated workflows (“write a test for this”, “implement this PRD step”, “fix this lint error”) get learned across tasks.
  • Infinite context — long autonomous tasks stop hitting the context wall via the ipc module.
  • Cost guards — Cline can rack up serious cost on big tasks. The cost-guard module enforces a hard ceiling.
  • Semantic cache — Cline often re-reads / re-analyzes the same files; cached responses save tokens.
A pipeline that enables all five:

PRXY_PIPE=mcp-optimizer,semantic-cache,patterns,ipc,cost-guard

The cost-guard is especially important here — Cline can spend $5-20 per non-trivial task without one.
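How the ceiling is set depends on your prxy.monster deployment; the limit variable below is hypothetical, not a documented setting — check your deployment's configuration docs for the real key. The shape of the idea:

```shell
# PRXY_PIPE is the pipeline from the section above.
export PRXY_PIPE=mcp-optimizer,semantic-cache,patterns,ipc,cost-guard

# HYPOTHETICAL: illustrative name only, not a documented prxy.monster
# variable — a hard per-task spend ceiling in USD for cost-guard.
export PRXY_COST_GUARD_LIMIT_USD=5
```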

Common issues

  • “Cline’s request format” / “context size errors” — Cline ships large system prompts. The ipc module helps but can’t shrink an explicit system prompt mid-conversation. Make sure you’ve selected a model with a large enough context window.
  • Streaming — Cline streams responses; prxy.monster passes streamed chunks through unchanged.
  • Tool use — Cline’s file-edit / shell-execute tools are client-side; the LLM just emits tool-call payloads. prxy.monster sees those and the mcp-optimizer doesn’t touch them (it only prunes MCP tools, not Cline’s built-ins).

Roo Code

Roo Code is a Cline fork with extra features. The settings layout is the same — same OpenAI Compatible / Anthropic Base URL fields.

Full example

This is a GUI-only setup, so there is no example repo; the walkthrough above is the entire integration.

Verify the current settings layout against the Cline docs — VS Code extension UIs evolve, but the OpenAI Compatible / custom Base URL pattern has been stable.
