
Using prxy.monster with Mastra

Mastra is a TypeScript agent framework built on top of the Vercel AI SDK. Because Mastra uses @ai-sdk/openai and @ai-sdk/anthropic under the hood, the Vercel AI SDK integration applies directly.

Install

You probably have these already:

npm install @mastra/core @ai-sdk/anthropic @ai-sdk/openai

Configure

Set the standard env vars:

export ANTHROPIC_BASE_URL=https://api.prxy.monster
export ANTHROPIC_API_KEY=prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx
export OPENAI_BASE_URL=https://api.prxy.monster/v1
export OPENAI_API_KEY=prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx
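
A misconfigured agent fails at the first LLM call, so it can be worth checking the variables up front. A minimal sketch; the preflight helper below is our own, not part of Mastra or prxy.monster:

```typescript
// Hypothetical preflight helper: report which proxy env vars are missing.
// The four variable names come from the Configure step above.
function missingProxyEnv(env: Record<string, string | undefined>): string[] {
  const required = [
    'ANTHROPIC_BASE_URL',
    'ANTHROPIC_API_KEY',
    'OPENAI_BASE_URL',
    'OPENAI_API_KEY',
  ];
  return required.filter((name) => !env[name]);
}

// Example: throw early instead of failing on the first LLM call.
// if (missingProxyEnv(process.env).length > 0) throw new Error('proxy env not set');
```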

Code change

None. Mastra agents constructed with model: anthropic('claude-sonnet-4-6') or model: openai('gpt-4o') will pick up the env vars automatically:

import { Agent } from '@mastra/core';
import { anthropic } from '@ai-sdk/anthropic';

const researcher = new Agent({
  name: 'researcher',
  instructions: 'You are a thorough research assistant.',
  model: anthropic('claude-sonnet-4-6'), // routes through prxy.monster
});

const result = await researcher.generate('What is composable middleware?');

If you want to be explicit (or you’re using createAnthropic/createOpenAI factories):

import { Agent } from '@mastra/core';
import { createAnthropic } from '@ai-sdk/anthropic';

const anthropic = createAnthropic({
  baseURL: 'https://api.prxy.monster',
  apiKey: process.env.ANTHROPIC_API_KEY,
});

const researcher = new Agent({
  name: 'researcher',
  instructions: '...',
  model: anthropic('claude-sonnet-4-6'),
});

Verify

curl https://api.prxy.monster/health

Then run any agent — a successful response confirms requests are routing through prxy.monster.
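
The same check can be scripted, e.g. in a startup hook. A sketch using the global fetch (Node 18+); checkProxyHealth is a hypothetical helper, and the injectable fetchImpl parameter exists only so it can be exercised without a live proxy:

```typescript
type FetchLike = (url: string) => Promise<{ ok: boolean }>;

// Hypothetical helper: true when GET /health answers with a 2xx.
// Endpoint taken from the curl command above.
async function checkProxyHealth(
  baseUrl = 'https://api.prxy.monster',
  fetchImpl: FetchLike = fetch,
): Promise<boolean> {
  const res = await fetchImpl(`${baseUrl}/health`);
  return res.ok;
}
```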

What you get

  • MCP optimization — Mastra agents that use MCP tools see irrelevant tool defs pruned per call.
  • Pattern memory — successful agent solutions get learned across runs.
  • Semantic cache — repeated agent queries return cached answers.
  • Infinite context — long-running multi-step agent runs stop hitting the wall.

For Mastra agents with tool use:

PRXY_PIPE=mcp-optimizer,semantic-cache,patterns,ipc

For high-throughput / cost-sensitive agents:

PRXY_PIPE=exact-cache,semantic-cache,cost-guard,patterns
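
If you assemble PRXY_PIPE programmatically (say, per environment), a small parser keeps the stage order explicit. The comma-separated format is inferred from the two examples above; the helper itself is not part of prxy.monster:

```typescript
// Sketch: split a PRXY_PIPE value into an ordered, de-duplicated stage list.
// Comma-separated format inferred from the pipeline examples above.
function parsePipe(value: string): string[] {
  const stages = value
    .split(',')
    .map((s) => s.trim())
    .filter((s) => s.length > 0);
  return [...new Set(stages)]; // order matters; drop accidental repeats
}
```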

Mastra workflows

Mastra workflows orchestrate multiple agent + tool steps. Each step that calls an LLM goes through prxy.monster — the workflow engine stays out of the loop.

Mastra memory

Mastra has its own memory primitives. They live inside the agent runtime. prxy.monster lives between the agent runtime and the LLM. They don’t conflict — Mastra memory handles agent-local state, prxy.monster handles cross-agent cache + cross-session patterns + token budget.

Common issues

  • Multi-provider agents — Mastra agents can mix providers. Set both ANTHROPIC_BASE_URL and OPENAI_BASE_URL so the right routing kicks in regardless of which provider any given step uses.
  • Streaming — agent.stream(...) works identically.

Full example

Adapt examples/nextjs-vercel-ai — replace the direct streamText({ model: anthropic('...') }) with a Mastra Agent instance using the same model wiring.

Mastra ships rapidly. Verify the agent constructor signature with the Mastra docs for your installed version. The model-pickup behavior is inherited from the Vercel AI SDK and is stable.
