Route Claude Code Through a Team Proxy with ANTHROPIC_BASE_URL
Teams often need to route AI traffic through a central gateway for cost tracking, usage limits, or compliance. Set ANTHROPIC_BASE_URL to point Claude Code at any Anthropic-compatible proxy — no code changes required.
export ANTHROPIC_BASE_URL=https://ai-gateway.yourcompany.com/anthropic
export ANTHROPIC_API_KEY=your-team-issued-key
Add these to your shell profile or a .env sourced by your team's setup script, and every developer's Claude Code session will route through the gateway automatically. This works with popular gateways including LiteLLM, Portkey, and AWS Bedrock proxies.
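As a sketch, a team setup script might write the shared env file once and have each developer source it from their shell profile (the filename and values below are illustrative, not a convention Claude Code requires):

```shell
# Illustrative setup script: write a shared env file, then source it from
# each developer's shell profile (filename and values are examples)
cat > "$HOME/.claude-gateway.env" <<'EOF'
export ANTHROPIC_BASE_URL=https://ai-gateway.yourcompany.com/anthropic
export ANTHROPIC_API_KEY=your-team-issued-key
EOF

# e.g. append `. ~/.claude-gateway.env` to ~/.bashrc or ~/.zshrc
. "$HOME/.claude-gateway.env"
echo "Gateway: $ANTHROPIC_BASE_URL"
```

Because the variables are exported, any Claude Code session started from that shell inherits them with no further configuration.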
It's also useful locally when you want to inspect exactly what Claude Code sends and receives:
# Run a local proxy to inspect traffic
litellm --model anthropic/claude-sonnet-4-5 --port 8080
export ANTHROPIC_BASE_URL=http://localhost:8080
claude # All requests now flow through LiteLLM
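Before pointing real sessions at a local proxy, you can sanity-check that the override is honored at all. In this sketch, `python3 -m http.server` is just a throwaway stand-in for the proxy, and `curl` stands in for Claude Code; the point is only that traffic lands on whatever `ANTHROPIC_BASE_URL` names:

```shell
# Throwaway local server on the port the proxy would use
# (assumption: port 8080 is free)
python3 -m http.server 8080 --bind 127.0.0.1 >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1

export ANTHROPIC_BASE_URL=http://127.0.0.1:8080
# Any client honoring the variable now hits the local server
STATUS=$(curl -s -o /dev/null -w '%{http_code}' "$ANTHROPIC_BASE_URL/")
echo "HTTP $STATUS from $ANTHROPIC_BASE_URL"

kill "$SERVER_PID"
```

Once that round trip works, swap the dummy server for the real proxy and launch `claude` from the same shell.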
Pair this with per-user API keys issued by the gateway to get per-developer cost breakdowns without sharing the master key. The gateway can also enforce rate limits or add request logging your security team requires.
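With LiteLLM, for example, the gateway can hold the upstream Anthropic key centrally and issue per-developer virtual keys. A minimal config sketch, using field names from LiteLLM's documented schema with placeholder values:

```yaml
# Sketch of a LiteLLM proxy config (config.yaml); field names follow
# LiteLLM's documented schema, values are placeholders
model_list:
  - model_name: claude-sonnet-4-5
    litellm_params:
      model: anthropic/claude-sonnet-4-5
      api_key: os.environ/ANTHROPIC_API_KEY  # upstream key stays on the gateway

general_settings:
  master_key: sk-your-master-key  # used only to mint per-developer virtual keys
```

Each developer then sets `ANTHROPIC_API_KEY` to their own virtual key, and the gateway attributes spend key by key.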
One environment variable, and your whole team's usage flows through the gateway you control.