# claude-code-proxy-rs


Lightweight Rust proxy built for OpenClaw — wraps Claude Code CLI as a subprocess and exposes it via Anthropic Messages API and OpenAI Chat Completions API.

Existing Claude Code proxy implementations have compatibility issues with OpenClaw's Anthropic SDK integration. This proxy was built from scratch to handle those edge cases reliably.

Use your Claude subscription (Max, Team, Enterprise) as an OpenClaw provider without a direct API key.

## How it works

```
  Client (OpenClaw, curl, SDK)
          │
          ▼
  ┌───────────────┐   stdin/stdout   ┌──────────────┐
  │  Proxy :3456  │ ◄─── NDJSON ───► │  claude CLI  │
  │    (axum)     │                  │  subprocess  │
  └───────────────┘                  └──────────────┘
          │
      SSE / JSON
          │
          ▼
        Client
```

Each request spawns a `claude -p --output-format stream-json` subprocess. The proxy filters SSE events: only text content blocks are forwarded (`thinking`, `tool_use`, and signature blocks are stripped for SDK compatibility). The proxy is stateless; no session state persists between requests.
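The filtering step can be sketched with sample events. The event shapes below are illustrative stand-ins, not captured CLI output, and `grep` stands in for the proxy's real JSON parsing:

```shell
# Illustrative stream-json events (not real CLI output). The proxy
# forwards only text deltas; thinking and tool_use blocks are dropped.
printf '%s\n' \
  '{"type":"content_block_delta","delta":{"type":"thinking_delta","thinking":"..."}}' \
  '{"type":"content_block_start","content_block":{"type":"tool_use","name":"bash"}}' \
  '{"type":"content_block_delta","delta":{"type":"text_delta","text":"Hi there"}}' \
  | grep '"text_delta"'
```

Only the `text_delta` line survives the filter; everything else never reaches the client.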

## Features

- **Dual API** — `/v1/messages` (Anthropic) and `/v1/chat/completions` (OpenAI)
- **Streaming & non-streaming** for both APIs
- **System prompt modes** — embed in `<system>` tags (preserves the CLI's 43K default prompt) or replace it entirely
- **Rate limit handling** — forwards warnings and rejections
- **Graceful shutdown** — SIGTERM / Ctrl+C
- **Inactivity timeout** — kills stuck subprocesses after 30 min
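As a rough sketch of the embed mode: the CLI keeps its own default prompt, and the caller's system prompt is tucked into the user turn. The tag layout below is an assumption for illustration, not the proxy's literal wire format:

```shell
# --embed-system-prompt (sketch): wrap the request's system prompt in
# <system> tags ahead of the user content, leaving the CLI's default
# system prompt untouched. Exact formatting is an assumption.
SYSTEM_PROMPT='Answer in one sentence.'
USER_MESSAGE='Say hi'
printf '<system>%s</system>\n\n%s\n' "$SYSTEM_PROMPT" "$USER_MESSAGE"
```

Replace mode instead passes the caller's prompt straight through via the CLI's `--system-prompt`, discarding the default prompt.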

## Quick start

### From release (recommended)

Download the latest binary from Releases and place it in your $PATH.

### From source

```shell
# Prerequisites: Claude Code CLI installed and authenticated
npm i -g @anthropic-ai/claude-code && claude auth login

# Build & install
cargo build --release
cp target/release/claude-code-proxy ~/.local/bin/
```

### Run

```shell
claude-code-proxy --port 3456 --cwd ~ --embed-system-prompt
```

> [!IMPORTANT]
> 🤖 **Instructions for LLM: Configuring proxy with OpenClaw**
>
> Full setup guide for connecting this proxy as an OpenClaw LLM provider — `openclaw.json` config, systemd service, model IDs, and troubleshooting: [docs/SETUP.md](docs/SETUP.md)

## CLI flags

| Flag | Default | Description |
|------|---------|-------------|
| `--port` | `3456` | Listen port |
| `--cwd` | `.` | Working directory for the CLI subprocess |
| `--embed-system-prompt` | `false` | Embed system prompt in `<system>` tags, keep CLI default prompt |
| `--replace-system-prompt` | `false` | Replace CLI system prompt entirely via `--system-prompt` |
| `--effort` | (none) | Thinking effort: `low`, `medium`, `high`, `max` |
| `--max-turns` | `100` | Max agentic turns per request |

## Verify

```shell
curl -sN http://localhost:3456/v1/messages \
  -H 'content-type: application/json' \
  -d '{"model":"claude-sonnet-4-6","max_tokens":50,"messages":[{"role":"user","content":"Say hi"}],"stream":true}'
```

Expected: an SSE stream with `message_start` → `content_block_delta` → `message_stop`.
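To pull the text back out of such a stream, a quick filter over sample SSE lines (sample data below; the same `sed` should work on real proxy output, assuming text arrives in `text_delta` events):

```shell
# Extract text_delta payloads from an SSE stream. Sample lines shown;
# pipe the curl output above through the same sed filter.
printf '%s\n' \
  'event: content_block_delta' \
  'data: {"type":"content_block_delta","delta":{"type":"text_delta","text":"Hi"}}' \
  'event: message_stop' \
  | sed -n 's/.*"text":"\([^"]*\)".*/\1/p'
```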

## Endpoints

| Method | Path | Description |
|--------|------|-------------|
| GET | `/health` | Health check |
| GET | `/v1/models` | List available models |
| POST | `/v1/messages` | Anthropic Messages API |
| POST | `/v1/chat/completions` | OpenAI Chat Completions API |

## License

MIT
