
Quick start — Cloud

Up and running in under 90 seconds. The only things you change in your app are two env vars.

Sign up

Create a free account at prxy.monster. No credit card required for the free tier.

Mint an API key

In the dashboard at app.prxy.monster/keys, click + New key.

Copy the key — it’s only shown once. Keys look like:

prxy_live_a1b2c3d4e5f6...

Keep this secret. Treat it like a provider API key. Revoke it from the dashboard if it leaks.

Point your app at the gateway

Two env vars. That’s the whole integration.

```shell
export ANTHROPIC_BASE_URL=https://api.prxy.monster
export ANTHROPIC_API_KEY=prxy_live_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```

```typescript
import Anthropic from '@anthropic-ai/sdk';

// SDK reads ANTHROPIC_BASE_URL + ANTHROPIC_API_KEY from env.
const client = new Anthropic();

const msg = await client.messages.create({
  model: 'claude-sonnet-4-6',
  max_tokens: 256,
  messages: [{ role: 'user', content: 'hi' }],
});
```

Verify

```shell
curl https://api.prxy.monster/v1/pipeline \
  -H "Authorization: Bearer $ANTHROPIC_API_KEY"
```

Returns the active module pipeline for your key. Default:

```json
{
  "configured": [],
  "active": [
    { "name": "mcp-optimizer", "version": "1.0.0" },
    { "name": "semantic-cache", "version": "1.0.0" },
    { "name": "patterns", "version": "1.0.0" }
  ],
  "override": null
}
```
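If you are verifying from code rather than curl, the response can be typed and inspected like this. The interfaces below are assumptions inferred from the sample JSON above, not an official SDK type:

```typescript
// Shape of the /v1/pipeline response, inferred from the documented default.
interface ModuleRef { name: string; version: string; }
interface PipelineResponse {
  configured: ModuleRef[];
  active: ModuleRef[];
  override: string | null;
}

// The default pipeline body from the docs, used here as a stand-in for a
// real fetch of /v1/pipeline.
const body: PipelineResponse = {
  configured: [],
  active: [
    { name: 'mcp-optimizer', version: '1.0.0' },
    { name: 'semantic-cache', version: '1.0.0' },
    { name: 'patterns', version: '1.0.0' },
  ],
  override: null,
};

const activeNames = body.active.map((m) => m.name);
console.log(activeNames.join(', ')); // mcp-optimizer, semantic-cache, patterns
```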

What just happened?

Every request through api.prxy.monster ran through the default pipeline:

| Module | What it did |
| --- | --- |
| mcp-optimizer | Embedded each MCP tool's description. Kept only the ones relevant to your prompt. |
| semantic-cache | Embedded the request and looked for similar past requests. Skipped the provider call on a hit. |
| patterns | Scanned the response for "the fix is X" type insights and saved them. |

You can replace any of those, add more, or strip them down to nothing.
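To make the semantic-cache row concrete, here is a minimal sketch of an embedding-similarity lookup: embed the request, compare against cached embeddings, and return a stored response on a near match. The `cosine` function, cache structure, and `0.95` threshold are all illustrative assumptions, not prxy.monster internals:

```typescript
// Cosine similarity between two embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

interface CacheEntry { embedding: number[]; response: string; }

// Returns a cached response if any past request is similar enough,
// otherwise null (meaning: fall through to the provider).
function semanticLookup(
  queryEmbedding: number[],
  cache: CacheEntry[],
  threshold = 0.95,
): string | null {
  let best: CacheEntry | null = null;
  let bestScore = -Infinity;
  for (const entry of cache) {
    const score = cosine(queryEmbedding, entry.embedding);
    if (score > bestScore) { bestScore = score; best = entry; }
  }
  return best !== null && bestScore >= threshold ? best.response : null;
}
```

A hit means the provider call is skipped entirely; a miss means the request proceeds upstream and its embedding and response can be added to the cache afterwards.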

Next steps

Bring Your Own Key (BYOK) — your provider key (Anthropic / OpenAI / etc.) lives in your app, not on our servers. Tokens are billed directly by the provider. We charge for the gateway, never for tokens. See pricing.

Provider options

prxy.monster speaks five upstream providers out of the box:

  • Anthropic: claude-* models
  • OpenAI: gpt-*, o1, o3, o4 models
  • Google: gemini-* models
  • Groq: llama-*, mixtral-*, groq/* models
  • AWS Bedrock: bedrock/<model-id> (Claude, Llama, Titan, Mistral, Cohere); see Bedrock provider

The model field on the request is the routing signal — no provider config to set per request.
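Prefix-based routing on the model string could be sketched like this. The prefix table mirrors the provider list above; the function itself is an assumption about how such routing might work, not the gateway's actual implementation:

```typescript
// Ordered (prefix, provider) pairs; first match wins.
const PREFIXES: Array<[string, string]> = [
  ['claude-', 'anthropic'],
  ['gpt-', 'openai'],
  ['o1', 'openai'],
  ['o3', 'openai'],
  ['o4', 'openai'],
  ['gemini-', 'google'],
  ['llama-', 'groq'],
  ['mixtral-', 'groq'],
  ['groq/', 'groq'],
  ['bedrock/', 'bedrock'],
];

// Maps a request's model field to an upstream provider, or null if unknown.
function routeProvider(model: string): string | null {
  for (const [prefix, provider] of PREFIXES) {
    if (model.startsWith(prefix)) return provider;
  }
  return null;
}

console.log(routeProvider('claude-sonnet-4-6')); // anthropic
console.log(routeProvider('bedrock/meta.llama3-70b-instruct-v1:0')); // bedrock
```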
