# mcp-agents

MCP server that wraps AI CLI tools — Claude Code, Gemini CLI, and Codex CLI — so any MCP client can call them as tools.

## Prerequisites

- Node.js >= 18
- At least one of the following CLIs installed and on your `$PATH`:

| CLI | Install |
| --- | --- |
| `claude` | Claude Code docs |
| `gemini` | `npm install -g @google/gemini-cli` |
| `codex` | `npm install -g @openai/codex` |

Only the CLI you select with `--provider` needs to be present.

## Quick test

```shell
# Default provider (codex)
npx mcp-agents

# Specific provider
npx mcp-agents --provider claude
npx mcp-agents --provider gemini
```

The server speaks JSON-RPC over stdio. It prints `[mcp-agents] ready (provider: <name>)` to stderr when it's listening.
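Once the server is listening, an MCP client drives it with JSON-RPC 2.0 messages over stdin/stdout, starting with an `initialize` request. A minimal sketch of what that first message looks like (the `protocolVersion` and `clientInfo` values here are illustrative MCP conventions, not specific to mcp-agents):

```python
import json

# First message an MCP client writes to the server's stdin:
# a JSON-RPC 2.0 "initialize" request, serialized as one JSON object.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # illustrative MCP protocol revision
        "capabilities": {},
        "clientInfo": {"name": "demo-client", "version": "0.1.0"},
    },
}

wire_line = json.dumps(initialize)  # what actually goes over stdio
```

After the server responds, the client can list and call the single tool the chosen provider exposes.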

## Providers & Tools

Each `--provider` flag maps to a single exposed tool:

| Provider | Tool name | CLI command |
| --- | --- | --- |
| `claude` | `claude_code` | `claude -p <prompt>` |
| `gemini` | `gemini` | `gemini [-s] -p <prompt>` |
| `codex` | (pass-through) | `codex mcp-server` |

### claude_code parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `prompt` | string | yes | The prompt to send to Claude Code |
| `timeout_ms` | integer | no | Timeout in ms (default: 120000) |

### gemini parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `prompt` | string | yes | The prompt to send to Gemini CLI |
| `sandbox` | boolean | no | Run in sandbox mode (`-s` flag) |
| `timeout_ms` | integer | no | Timeout in ms (default: 120000) |
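A `tools/call` request for the `gemini` tool, using the parameters above, might look like this (a sketch of the standard MCP call shape; the prompt text is made up):

```python
import json

# JSON-RPC "tools/call" request invoking the gemini tool with its
# documented arguments (prompt required; sandbox and timeout_ms optional).
call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "gemini",
        "arguments": {
            "prompt": "Summarize this repository in one sentence",  # example
            "sandbox": True,      # adds the -s flag to the gemini invocation
            "timeout_ms": 60000,  # override the 120000 ms default
        },
    },
}

wire_line = json.dumps(call)
```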

### codex (pass-through)

The `codex` provider passes through to Codex's native MCP server (`codex mcp-server`) with configurable flags:

| CLI flag | Default | Codex flag |
| --- | --- | --- |
| `--model` | `gpt-5.2-codex` | `-m <model>` |
| `--model_reasoning_effort` | `high` | `-c model_reasoning_effort=<value>` |

Hardcoded defaults: `-s read-only -a never` (safe for MCP server mode).
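Putting the configurable flags and hardcoded defaults together, the resulting child-process command line can be sketched roughly as follows. This is an illustration only: `build_codex_argv` is a hypothetical helper, and the exact placement of flags relative to the `mcp-server` subcommand is an assumption, not taken from the source.

```python
def build_codex_argv(model="gpt-5.2-codex", reasoning_effort="high"):
    """Assemble a codex mcp-server command line from the documented flags.

    Only the flags themselves come from the docs; their ordering here
    is illustrative.
    """
    return [
        "codex",
        "-m", model,
        "-c", f"model_reasoning_effort={reasoning_effort}",
        "-s", "read-only",  # hardcoded sandbox default
        "-a", "never",      # hardcoded approval default
        "mcp-server",
    ]

argv = build_codex_argv(model="o3-pro", reasoning_effort="medium")
```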

## Integration with Claude Code

Add entries to your project's `.mcp.json`:

```json
{
  "mcpServers": {
    "codex": {
      "command": "npx",
      "args": ["-y", "mcp-agents@latest", "--provider", "codex"]
    },
    "gemini": {
      "command": "npx",
      "args": ["-y", "mcp-agents@latest", "--provider", "gemini"]
    }
  }
}
```

Override codex defaults:

```json
{
  "mcpServers": {
    "codex": {
      "command": "npx",
      "args": ["-y", "mcp-agents@latest", "--provider", "codex", "--model", "o3-pro", "--model_reasoning_effort", "medium"]
    }
  }
}
```

## Integration with OpenAI Codex

Add two entries to `~/.codex/config.toml`, one per provider you want available:

```toml
[mcp_servers.claude-code]
command = "npx"
args = ["-y", "mcp-agents", "--provider", "claude"]

[mcp_servers.gemini]
command = "npx"
args = ["-y", "mcp-agents", "--provider", "gemini"]
```

Then in a Codex session you can call the `claude_code` or `gemini` tools, which shell out to the respective CLIs.

## How it works

1. An MCP client connects over stdio.
2. The server reads `--provider <name>` from its argv (defaults to `codex`).
3. It registers a single tool matching that provider's CLI.
4. The client calls `tools/call` with the tool name and a prompt.
5. The server runs the CLI as a child process and returns stdout (or stderr) as the tool result.
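Step 5 can be sketched as follows. This is a simplified Python illustration of the child-process step, not the actual implementation (the real server is Node.js, and `run_cli` is a hypothetical name):

```python
import subprocess

def run_cli(argv, timeout_ms=120000):
    """Run a provider CLI and return its output as the tool result.

    Mirrors the documented behavior: a configurable timeout (default
    120000 ms), stdout on success, stderr otherwise.
    """
    try:
        proc = subprocess.run(
            argv,
            capture_output=True,
            text=True,
            timeout=timeout_ms / 1000,
        )
    except subprocess.TimeoutExpired:
        return f"error: timed out after {timeout_ms} ms"
    return proc.stdout if proc.returncode == 0 else proc.stderr
```

For the `claude` provider, for example, this would amount to invoking `run_cli(["claude", "-p", prompt])` and handing the captured output back to the MCP client.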

The server includes a keepalive timer to prevent Node.js from exiting prematurely when stdin reaches EOF before the async subprocess registers an active handle.

## License

MIT
