## Summary
The Codex Claude Code plugin's app-server identifies itself as `clientInfo.name = "Claude Code"` during the `initialize` handshake. The OpenAI Codex backend appears to gate access to newer models (currently `gpt-5.5`) by client name, and rejects requests from this identifier with a misleading version-related error:
```json
{"type":"error","status":400,"error":{"type":"invalid_request_error",
  "message":"The 'gpt-5.5' model requires a newer version of Codex. Please upgrade to the latest app or CLI and try again."}}
```
The codex CLI itself (`codex exec -m gpt-5.5`) works fine on the same machine — it identifies as `codex_exec`. So this is not a CLI version issue; the user's CLI is current.
## Reproduction
- codex-cli: 0.126.0-alpha.10
- codex plugin: 1.0.4
- auth: ChatGPT Pro

```sh
# Works:
codex exec -m gpt-5.5 "say hi"

# Fails with the error above:
node ~/.claude/plugins/cache/openai-codex/codex/1.0.4/scripts/codex-companion.mjs task --model gpt-5.5 "say hi"
```
`/codex:rescue`, `/codex:review`, and `/codex:adversarial-review` all fail the same way when invoked with `gpt-5.5`, since they share the app-server path.
## Root cause
`plugins/codex/scripts/lib/app-server.mjs` hardcodes:
```js
const DEFAULT_CLIENT_INFO = {
  title: "Codex Plugin",
  name: "Claude Code",
  version: PLUGIN_MANIFEST.version ?? "0.0.0"
};
```
The codex tracing logs confirm requests go out as `app_server.client_name="Claude Code"` from the plugin vs `app_server.client_name="codex_exec"` from `codex exec`. Patching `name` to `"codex_cli"` locally makes the same `gpt-5.5` request succeed end-to-end (turn/completed → assistant reply).
## Suggested fix
Change `DEFAULT_CLIENT_INFO.name` to a `codex_*`-prefixed identifier (e.g. `"codex_claude_code_plugin"` or `"codex_cli"`) so the plugin inherits the same model-gating allowlist as the CLI. Alternatively, OpenAI could whitelist `"Claude Code"` server-side.
Happy to send a PR if useful.