# Using prxy.monster with Aider
Aider is a terminal-based coding assistant that pairs with git. It supports custom OpenAI-compatible endpoints natively via CLI flags or environment variables.
## Configure
Three ways: CLI flags, environment variables, or a config file. Pick one.
### Option A: CLI flags
```shell
aider \
  --openai-api-base https://api.prxy.monster/v1 \
  --openai-api-key prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx \
  --model gpt-4o
```

For Anthropic models:
```shell
aider \
  --anthropic-api-base https://api.prxy.monster \
  --anthropic-api-key prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx \
  --model claude-sonnet-4-6
```

### Option B: Env vars
```shell
export OPENAI_API_BASE=https://api.prxy.monster/v1
export OPENAI_API_KEY=prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx
# OR for Anthropic
export ANTHROPIC_API_BASE=https://api.prxy.monster
export ANTHROPIC_API_KEY=prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx

aider --model claude-sonnet-4-6
```

### Option C: Aider config file
Put it in `~/.aider.conf.yml`:
```yaml
openai-api-base: https://api.prxy.monster/v1
openai-api-key: prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx
model: gpt-4o
```

## Code change

None — aider is a CLI; nothing to modify.
## Verify
```shell
curl https://api.prxy.monster/health
```

Then run aider and ask any question — a successful response confirms routing.
## What you get
Aider workflows are highly repetitive across a project — same files, same coding patterns, same fix categories. This is ideal for prxy.monster:
- Pattern memory — aider’s typical task (“fix this bug”, “add a test”, “refactor this function”) becomes more efficient as patterns accumulate.
- Semantic cache — re-asking similar questions across sessions returns cached answers.
- Cost guards — a hard cap on per-day aider spend (long aider sessions can rack up costs surprisingly fast).
- Infinite context — long aider sessions on big repos stop hitting the context wall via the `ipc` module.
## Recommended pipeline
```shell
PRXY_PIPE=semantic-cache,patterns,ipc,cost-guard
```

For high-volume aider users who want repeat-prompt savings:

```shell
PRXY_PIPE=exact-cache,semantic-cache,patterns,cost-guard
```

## Streaming

Aider streams output by default; prxy.monster passes the stream through.
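The cost-guard stage in the recommended pipelines amounts to a per-day budget counter checked before each upstream call. A minimal sketch, assuming a hypothetical `CostGuard` class and cap — not prxy.monster's actual internals:

```python
from datetime import date

class CostGuard:
    """Reject requests once the day's estimated spend crosses a hard cap."""
    def __init__(self, daily_limit_usd: float):
        self.daily_limit = daily_limit_usd
        self.day = date.today()
        self.spent = 0.0

    def _roll_day(self) -> None:
        today = date.today()
        if today != self.day:           # new day: reset the counter
            self.day, self.spent = today, 0.0

    def allow(self, estimated_cost_usd: float) -> bool:
        self._roll_day()
        if self.spent + estimated_cost_usd > self.daily_limit:
            return False                # hard cap: block before the call is made
        self.spent += estimated_cost_usd
        return True

guard = CostGuard(daily_limit_usd=5.00)
print(guard.allow(4.00))  # True  — within budget
print(guard.allow(2.00))  # False — would exceed the $5/day cap
```

The key property for aider use is that the check happens before the LLM call, so a runaway session stops at the cap instead of after the bill arrives.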
## Multi-model aider
Aider supports separate models for "main" edits and "weak" housekeeping tasks (commit messages, chat-history summarization):
```shell
aider --model claude-sonnet-4-6 --weak-model claude-haiku-4-5 \
  --openai-api-base https://api.prxy.monster/v1 \
  --openai-api-key prxy_live_xxx
```

Both models route through prxy.monster.
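Since the same proxy accepts both OpenAI-style and Anthropic-style traffic, the upstream backend is presumably chosen per request from the model name. A sketch of what such routing could look like — the prefixes and the `route_backend` function are assumptions for illustration, not prxy.monster's actual logic:

```python
def route_backend(model: str) -> str:
    """Pick an upstream backend from the model name (illustrative only)."""
    if model.startswith("claude-"):
        return "anthropic"
    if model.startswith(("gpt-", "o1", "o3")):
        return "openai"
    raise ValueError(f"unknown model family: {model}")

print(route_backend("claude-sonnet-4-6"))  # anthropic
print(route_backend("gpt-4o"))             # openai
```

This is why a mixed main/weak pair like the one above works through a single endpoint: each request carries its model name, and routing is decided per call.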
## Common issues
- `--no-stream` — works fine. Cache hits return as a single non-streamed response in this mode.
- Git integration — aider's git operations are client-side; prxy.monster sees only the LLM calls.
- Context size errors — same as Cline: if aider's diffs balloon the context, the `ipc` module helps but can't compress an explicit user-supplied diff. Use a model with a large enough window.
- `/voice` mode (Whisper) — aider sends audio directly to OpenAI's Whisper endpoint, which currently bypasses prxy.monster (no Whisper proxying yet).
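What an `ipc`-style context stage does can be approximated as dropping the oldest turns until the conversation fits a token budget. A toy sketch — the 4-chars-per-token estimate and `trim_history` are illustrative assumptions, and, as noted above, a large user-supplied diff in the latest message cannot be trimmed away this way:

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token.
    return max(1, len(text) // 4)

def trim_history(messages: list[dict], budget: int) -> list[dict]:
    """Drop the oldest non-system turns until the history fits the budget."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    def total(msgs):
        return sum(estimate_tokens(m["content"]) for m in msgs)
    while rest and total(system + rest) > budget:
        rest.pop(0)                     # oldest turn goes first
    return system + rest

history = [
    {"role": "system", "content": "You are a coding assistant."},
    {"role": "user", "content": "old question " * 50},
    {"role": "assistant", "content": "old answer " * 50},
    {"role": "user", "content": "current question about utils.py"},
]
trimmed = trim_history(history, budget=100)
```

Note that the newest message survives intact here, which mirrors the limitation above: if that message itself contains a huge diff, no amount of history trimming rescues an undersized context window.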
## Full example
Adapt `examples/openai-quickstart` — but for aider the integration is purely CLI flags; no example app is needed.
Verify CLI flag names against `aider --help` for your installed version. The `--openai-api-base` / `--anthropic-api-base` / `--openai-api-key` / `--anthropic-api-key` flags have been stable across many releases.