How to Use GPT-5.3 Codex with Claude Code, OpenCode, OpenClaw, and Codex CLI
Use OpenAI's GPT-5.3 Codex with coding agents built for the Anthropic API — Claude Code, OpenCode, OpenClaw, Codex CLI, and any Anthropic SDK. Smart AIPI translates between API formats automatically.
TL;DR: Set ANTHROPIC_API_KEY to your Smart AIPI key and ANTHROPIC_BASE_URL to https://api.smartaipi.com. Claude Code, OpenCode, OpenClaw, Codex CLI, and any Anthropic SDK will use GPT-5.3 Codex on the backend — 75% cheaper, automatic format translation.
Smart AIPI lets you use OpenAI models like GPT-5.3 Codex with any tool built for the Anthropic Claude API — including Claude Code, OpenCode, OpenClaw, Codex CLI, and the Anthropic Python/Node SDKs. It translates between API formats automatically so you don't need to change any code.
Why Would I Need This?
Many popular coding agents and developer tools are built for the Anthropic API format:
- Claude Code — Anthropic's CLI coding agent
- OpenCode — Open-source terminal-based AI coding assistant
- OpenClaw — AI coding agent with Anthropic API support
- Codex CLI — OpenAI's coding agent (supports Anthropic API format)
- Anthropic SDKs — Python and Node.js libraries
These tools expect the Anthropic /v1/messages endpoint. If you want to use OpenAI models with them, you'd normally need to write a translation layer. Smart AIPI handles this automatically.
How Do I Set It Up?
Set two environment variables. Most Anthropic-compatible tools read these automatically:
export ANTHROPIC_API_KEY=sk-proj-your-smart-aipi-key
export ANTHROPIC_BASE_URL=https://api.smartaipi.com
That's it. Your tool sends Anthropic-format requests to Smart AIPI, which translates them to OpenAI format, gets the response from GPT-5.3 Codex, and translates it back.
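To sanity-check the setup, you can send a raw Anthropic-format request with curl. The base URL and endpoint come from the setup above; the x-api-key and anthropic-version headers are the Anthropic API's standard request headers, and the model name here is illustrative.

```shell
# Send an Anthropic-format request to Smart AIPI's /v1/messages endpoint.
# Smart AIPI translates it to OpenAI format behind the scenes.
curl https://api.smartaipi.com/v1/messages \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{
    "model": "claude-sonnet-4-5",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

If everything is wired up correctly, the response comes back in Anthropic's message format even though GPT-5.3 Codex produced it.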
Python (Anthropic SDK)
from anthropic import Anthropic

client = Anthropic(
    api_key="sk-proj-your-smart-aipi-key",
    base_url="https://api.smartaipi.com",
)

# This hits GPT-5.3 Codex on the backend
response = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)
Node.js (Anthropic SDK)
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic({
  apiKey: 'sk-proj-your-smart-aipi-key',
  baseURL: 'https://api.smartaipi.com',
});

// This hits GPT-5.3 Codex on the backend
const response = await client.messages.create({
  model: 'claude-sonnet-4-5-20250929',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello!' }],
});
What Exactly Gets Translated?
Smart AIPI handles the full round-trip translation between Anthropic and OpenAI API formats:
- Request format — Anthropic-style system + messages converted to OpenAI's format
- Response format — OpenAI responses converted back to Anthropic's content_block structure
- Streaming — Server-sent events translated between both formats in real time
- Model mapping — Claude model names automatically routed to the best matching OpenAI model
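To make the request-side translation concrete, here is a minimal sketch of the idea (not Smart AIPI's actual code): an Anthropic-style body carries the system prompt as a top-level field, while OpenAI expects it as the first message with role "system".

```python
def anthropic_to_openai(request: dict) -> dict:
    """Sketch: convert an Anthropic /v1/messages body to OpenAI chat format."""
    messages = []
    # Anthropic puts the system prompt in a top-level "system" field;
    # OpenAI expects it as the first entry of the messages list.
    if "system" in request:
        messages.append({"role": "system", "content": request["system"]})
    messages.extend(request["messages"])
    return {
        "model": request["model"],  # remapped separately by the gateway
        "max_tokens": request["max_tokens"],
        "messages": messages,
    }

anthropic_request = {
    "model": "claude-sonnet-4-5",
    "max_tokens": 1024,
    "system": "You are a concise assistant.",
    "messages": [{"role": "user", "content": "Hello!"}],
}
openai_request = anthropic_to_openai(anthropic_request)
print(openai_request["messages"][0]["role"])  # system
```

The reverse direction works the same way: the gateway takes OpenAI's choices array and rebuilds Anthropic's content block list from it.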
Which Models Map to What?
When you request a Claude model name, Smart AIPI routes it to the equivalent OpenAI model:
| You Request | Smart AIPI Routes To | Cost / 1M Output |
|---|---|---|
| claude-opus-4-6 | GPT-5.3 Codex | $3.50 |
| claude-sonnet-4-5 | GPT-5.3 Codex | $3.50 |
| claude-haiku-4-5 | Codex Mini | $0.15 |
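The routing table above can be sketched as a simple prefix lookup. The prefix match is an assumption on our part, motivated by dated model names like claude-sonnet-4-5-20250929 in the SDK examples; the OpenAI-side identifier strings are likewise illustrative, not confirmed API values.

```python
# Illustrative routing table; target model identifiers are assumptions.
MODEL_MAP = {
    "claude-opus-4-6": "gpt-5.3-codex",
    "claude-sonnet-4-5": "gpt-5.3-codex",
    "claude-haiku-4-5": "codex-mini",
}

def route_model(requested: str) -> str:
    """Match the requested Claude model name by prefix, so dated
    variants like claude-sonnet-4-5-20250929 resolve too."""
    for prefix, target in MODEL_MAP.items():
        if requested.startswith(prefix):
            return target
    raise ValueError(f"no mapping for model {requested!r}")

print(route_model("claude-sonnet-4-5-20250929"))  # gpt-5.3-codex
```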
Frequently Asked Questions
Can I use Claude Code with Smart AIPI?
Yes. Set ANTHROPIC_API_KEY and ANTHROPIC_BASE_URL to point to Smart AIPI, and Claude Code will work using GPT-5.3 Codex on the backend.
Does streaming work through the Anthropic endpoint?
Yes. Smart AIPI translates OpenAI's streaming format to Anthropic's SSE format in real time. No changes needed in your code.
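As a rough illustration of what that streaming translation involves (again, a sketch, not Smart AIPI's actual code): each OpenAI chat.completion.chunk carries its text in choices[0].delta.content, while Anthropic's SSE stream delivers the same text as a content_block_delta event.

```python
def openai_chunk_to_anthropic_event(chunk: dict, block_index: int = 0) -> dict:
    """Sketch: map one OpenAI streaming chunk to the equivalent
    Anthropic content_block_delta SSE event."""
    text = chunk["choices"][0]["delta"].get("content", "")
    return {
        "type": "content_block_delta",
        "index": block_index,
        "delta": {"type": "text_delta", "text": text},
    }

chunk = {"choices": [{"delta": {"content": "Hel"}, "index": 0}]}
event = openai_chunk_to_anthropic_event(chunk)
print(event["delta"]["text"])  # Hel
```

A real translator also has to emit Anthropic's surrounding lifecycle events (message_start, content_block_start, message_stop, and so on), which is why doing this by hand is tedious.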
Is the response quality the same as calling OpenAI directly?
Yes. Smart AIPI routes to the same OpenAI models. The only difference is the API format translation and the 75% lower price.
Can I also use the OpenAI format with the same API key?
Yes. Smart AIPI supports both /v1/chat/completions (OpenAI) and /v1/messages (Anthropic) with the same API key. Use whichever format your tool needs.
OpenAI-compatible API gateway. Access frontier AI models at 75% less cost.
Start for free