Every AI coding tool wants its own API key. Its own config. Its own account.
Codex CLI wants OpenAI. Claude Code wants Anthropic. Gemini CLI wants Google. And you're sitting there with 5 browser tabs of API dashboards open, copy-pasting keys like it's 2019.
What if your tools never had to know where their tokens come from?
The Problem Nobody Talks About
You've got 3 ChatGPT accounts (don't pretend you don't). Two API keys. Maybe a Claude account. And every time you switch tools, you're manually rewiring the plumbing.
Worse — when one account hits its rate limit at 2 AM during a debug session, you're done. Go to bed. Try tomorrow.
That's not how this should work.
ProxyPool Hub: Your AI Traffic Controller
ProxyPool Hub is an open-source local proxy that sits between your AI tools and their APIs. Every tool points to localhost:8081, and the proxy figures out the rest.
npx proxypool-hub@latest start
One command. Dashboard at localhost:8081. Done.
But the latest release isn't just about pooling accounts anymore. It's about intelligent routing.
What's New: The Smart Routing Update
Bind Apps to Specific Credentials
This is the big one.
You can now tell ProxyPool Hub: "Always route Codex through my Azure OpenAI key. Always route Claude Code through my Claude account. Route everything else through the pool."
It's called App Assignments — and it changes how you think about API credentials.
- Codex CLI → Azure OpenAI key (your fastest endpoint)
- Claude Code → Claude account (your Pro subscription)
- Gemini CLI → Gemini API key (your free tier)
- OpenClaw → Whatever's available
Each app can have multiple fallback bindings. If binding #1 is rate-limited, it tries #2. If all fail and you've enabled fallback, it drops back to automatic rotation.
No more "wrong key for wrong tool" accidents.
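The fallback behavior described above can be sketched in a few lines. This is a hypothetical illustration, not ProxyPool Hub's actual internals; the names (`resolve_credential`, `allow_pool_fallback`) are invented for the example.

```python
RATE_LIMITED = "rate_limited"

def resolve_credential(app, bindings, pool, status, allow_pool_fallback=True):
    """Try the app's bound credentials in order; optionally fall back to the pool."""
    for cred in bindings.get(app, []):
        if status.get(cred) != RATE_LIMITED:
            return cred  # first healthy binding wins
    if allow_pool_fallback:
        for cred in pool:
            if status.get(cred) != RATE_LIMITED:
                return cred  # drop back to automatic rotation
    return None  # nothing available: surface an error to the caller
```

With fallback disabled, an app bound to exhausted credentials gets an error instead of silently using a pool key it was never meant to touch.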
Model Mapping That Actually Makes Sense
Instead of making you memorize that gpt-4o should map to gemini-2.5-pro on Google or claude-sonnet-4-20250514 on Anthropic, ProxyPool Hub introduces tier-based mapping.
Four tiers: flagship, standard, fast, reasoning.
When Codex asks for gpt-4o, the proxy recognizes it as a "standard" tier model and maps it to whatever you've configured as your standard model on the target provider. New models auto-classify based on their names — no manual updates needed.
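Name-based auto-classification could look something like this rough sketch. The substring hints below are guesses for illustration; ProxyPool Hub's real heuristics live in its source.

```python
# Substring hints per tier; anything unmatched falls into "standard".
TIER_HINTS = {
    "reasoning": ("o1", "o3", "thinking"),
    "fast": ("mini", "flash", "haiku", "lite"),
    "flagship": ("opus", "ultra"),
}

def classify_tier(model_name: str) -> str:
    name = model_name.lower()
    for tier, hints in TIER_HINTS.items():
        if any(hint in name for hint in hints):
            return tier
    return "standard"

def map_model(model_name: str, target_config: dict) -> str:
    """Translate a requested model to the target provider's configured model."""
    return target_config[classify_tier(model_name)]
```

So with `{"standard": "gemini-2.5-pro", ...}` configured for Google, a request for gpt-4o lands on gemini-2.5-pro without any per-model table.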
Codex Can See Images Now
This one's subtle but important. Codex CLI can read image files as part of its tool workflow. But Azure OpenAI uses a different image format (image_url) than what Codex sends (input_image).
ProxyPool Hub now transparently converts between all image formats — base64, URLs, data URIs — across all providers. Codex vision tasks just work, regardless of which backend you're routing through.
Free Model Toggle
Not every request needs GPT-4. The new Free Models switch routes lightweight requests (Haiku-tier) to free providers automatically. Turn it off when you need guaranteed quality. Turn it on to save money on code completions and quick lookups.
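The toggle's decision boils down to a predicate like the one below (illustrative logic, not the shipped code), using the tier names from the model-mapping section:

```python
def use_free_provider(tier: str, free_models_enabled: bool) -> bool:
    """Only lightweight (fast/Haiku-tier) requests are eligible for free providers."""
    return free_models_enabled and tier == "fast"
```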
Built-in Chat
Sometimes you just want to test if your credentials work. The new dashboard chat lets you pick any credential source and start a streaming conversation — right in the browser. No CLI needed.
The Architecture (It's Simpler Than You Think)
Your AI Tools (Codex, Claude Code, Gemini CLI, OpenClaw)
                      |
               localhost:8081
                      |
       +--------------+--------------+
       |              |              |
  App Router      Model Map      Free Model
       |              |            Routing
       +--------------+--------------+
                      |
 +---------+----------+---------+---------+
 |         |          |         |         |
ChatGPT   Claude     Azure     Gemini    Vertex
Accounts  Accounts   OpenAI     API       AI
Every request gets:
- Detected — which app sent it
- Routed — to the assigned credential (or the pool)
- Mapped — model name translated to the target provider
- Translated — protocol converted (OpenAI ↔ Anthropic ↔ Gemini)
- Delivered — with automatic retry on rate limits
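The five stages above can be sketched as a toy pipeline. The stage functions are injected so each concern stays swappable; all names here are illustrative, not ProxyPool Hub's API.

```python
def handle(request, detect, route, map_model, translate, deliver):
    app = detect(request)               # Detected: which app sent it
    cred = route(app)                   # Routed: assigned credential or pool
    request = map_model(request, cred)  # Mapped: model for the target provider
    request = translate(request, cred)  # Translated: protocol conversion
    return deliver(request, cred)       # Delivered: retry handled in this stage
```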
It's 100% Local
- Runs on localhost — nothing leaves your machine
- Direct connections to official APIs — no relay servers
- Zero telemetry, zero tracking
- Credentials stored locally with restricted permissions
Get Started in 30 Seconds
# No install needed
npx proxypool-hub@latest start
# Or install globally
npm install -g proxypool-hub
proxypool-hub start
Open http://localhost:8081. Add your accounts or API keys. Click "Configure" next to your favorite CLI tool. Start coding.
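If you'd rather smoke-test from a script than the dashboard, something like this works, assuming the proxy exposes an OpenAI-compatible /v1/chat/completions endpoint (an assumption; check the dashboard for the exact paths your tools should use):

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a minimal OpenAI-style chat completion request."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# With the proxy running:
# req = build_chat_request("http://localhost:8081", "gpt-4o", "ping")
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```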
Links
- GitHub: github.com/yiyao-ai/proxypool-hub
- npm: npmjs.com/package/proxypool-hub
- Discord: Join the community
If this saves you even one "wrong API key" headache, drop a star on GitHub. It helps other developers find the project.
ProxyPool Hub is open-source under AGPL-3.0. Not affiliated with Anthropic, OpenAI, or Google.

