
Privacy mode (local)

For workloads where data residency or privacy compliance rules out a hosted gateway. Everything runs in a Docker container on your own hardware. The only outbound traffic is the LLM call itself, to the provider you choose.

This recipe assumes local mode. Cloud mode by definition involves a hosted service.

What this pipeline gives you

  • All caching, learning, and context management happens in a SQLite file on your disk.
  • (v1.1) airgap enforces that the gateway makes no outbound network call other than the provider request.
  • (v1.1) guardrails redacts PII before the request leaves your machine for the provider.

The pipeline

docker run -d \
  -p 3099:3099 \
  -v ~/.prxy:/data \
  -e ANTHROPIC_API_KEY=sk-ant-xxx \
  -e PRXY_PIPE='ipc,patterns,semantic-cache' \
  prxymonster/local:latest
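A quick sanity check after the container starts: send one request through the gateway. This is a sketch assuming the gateway proxies the Anthropic Messages API unchanged; the endpoint path, version header, and model id below are Anthropic's conventions, not something this page guarantees, so adjust them if your build differs.

```shell
# Sanity check: one request through the local gateway on the port
# mapped above. Assumes a transparent proxy for the Anthropic
# Messages API; path and model id may differ in your build.
curl -s http://127.0.0.1:3099/v1/messages \
  -H 'content-type: application/json' \
  -H 'anthropic-version: 2023-06-01' \
  -d '{
        "model": "claude-sonnet-4-20250514",
        "max_tokens": 64,
        "messages": [{"role": "user", "content": "ping"}]
      }'
```

Sending the same request twice is also a useful check: once semantic-cache has warmed up, a repeat should be served from the local SQLite store rather than the provider.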

When v1.1 ships:

PRXY_PIPE='airgap,guardrails,ipc,patterns,semantic-cache'

Why this order

  1. (v1.1) airgap first — enforces no-network-out at the start of every request. Other modules then run inside the network sandbox.
  2. (v1.1) guardrails — redacts PII before anything else touches the prompt.
  3. ipc — manages context length using only local storage.
  4. patterns — your forged patterns stay in ~/.prxy/prxy.db. Never sync, never share.
  5. semantic-cache — embeddings + cached responses live locally.

What’s stored on disk

~/.prxy/
├── prxy.db      ← SQLite. Patterns, cached embeddings + responses, sessions.
├── blob/        ← Archived (compressed-out) message bodies.
└── config.yaml  ← Optional pipeline config.

The container has no other state. Restart it — everything persists. Delete ~/.prxy/ — everything is gone.
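Because all state lives in that one directory, backups are simple. A sketch using sqlite3's online `.backup` (safe to run while the container is writing) plus gpg for the encryption the hardening checklist below asks for; the snapshot paths are examples, not anything prxy expects.

```shell
# Take a consistent snapshot of the state database while the gateway
# is running (.backup copies safely even with active writers).
sqlite3 ~/.prxy/prxy.db ".backup '/tmp/prxy-snapshot.db'"

# Encrypt before the snapshot leaves the machine; archive blob/
# (and config.yaml, if present) alongside it the same way.
gpg --symmetric --cipher-algo AES256 /tmp/prxy-snapshot.db
```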

What leaves your machine

The LLM API call itself. Nothing else.

The gateway makes outbound calls only to:

  • The configured provider (Anthropic, OpenAI, Google, Groq) — over HTTPS.
  • Optionally: an embedding provider (Voyage / OpenAI) for semantic-cache’s vector lookups. To disable this, leave VOYAGE_API_KEY and OPENAI_API_KEY unset for embeddings; mcp-optimizer and semantic-cache then fall back to the offline stub embedder.

No telemetry. No phone-home. No background sync. The container’s network egress can be audited with any standard packet capture tool.

Hardening checklist

  • No OPENAI_API_KEY set unless you actually want OpenAI to receive your prompts.
  • VOYAGE_API_KEY unset → falls back to offline stub embeddings.
  • (v1.1) airgap module added → enforces zero non-provider network traffic at the gateway layer.
  • Bind the container to localhost only: -p 127.0.0.1:3099:3099.
  • Ship the ~/.prxy/ directory to encrypted backups; nowhere else.
  • Audit outbound traffic via tcpdump or your firewall after first-run sanity test.
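The last two checklist items can be combined into one pass. A sketch assuming the default Docker bridge (docker0) and HTTPS (port 443) to the provider as the only legitimate egress; the interface name and 60-second capture window are assumptions to adapt to your setup.

```shell
# Capture outbound connection attempts (SYNs) from the container
# network for one minute. Requires root; docker0 is Docker's default
# bridge -- substitute your interface if it differs.
sudo timeout 60 tcpdump -i docker0 -nn -l \
  'tcp[tcpflags] & tcp-syn != 0' > /tmp/egress.log || true

# Anything not dialing port 443 is suspect. In a clean local-mode
# setup this prints nothing.
grep -vE '\.443: ' /tmp/egress.log
```

Run it once right after first start (cold caches force real provider calls) and again during normal use; the only destinations in the log should resolve to your configured provider.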
