fix(anthropic): support built-in server tools and improve count_tokens response#106
Conversation
The prompt_tokens reported to clients (used by Claude Code's /context command) were wildly inaccurate because they were derived from Kiro's contextUsagePercentage, which returns unreliable values. Instead, count tokens from the complete serialized Kiro request payload using tiktoken. This includes the system prompt, messages, tools, and all other payload fields, matching what actually gets sent to the API.

- Replace request_messages/request_tools params with pre-counted prompt_tokens across all streaming functions
- Count tokens from the full kiro_request_body in both the OpenAI and Anthropic route handlers
- Remove the dependency on contextUsagePercentage for token counting
- Update tests to match the new function signatures
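A minimal sketch of the counting approach described above. The function name and the injectable `encode` parameter are illustrative (the real code presumably calls tiktoken directly); serializing the whole payload and counting the encoded result is the idea from the commit message.

```python
import json


def count_prompt_tokens(kiro_request_body: dict, encode=None) -> int:
    """Count tokens over the full serialized Kiro payload.

    Counting the serialized JSON (system prompt, messages, tools, and all
    other fields) matches what is actually sent upstream, unlike deriving
    a number from contextUsagePercentage.
    """
    if encode is None:
        # Sketch assumes tiktoken is available; cl100k_base is a common choice.
        import tiktoken
        encode = tiktoken.get_encoding("cl100k_base").encode
    serialized = json.dumps(kiro_request_body, separators=(",", ":"))
    return len(encode(serialized))
```

The injectable encoder also makes the helper easy to unit-test without loading a tokenizer.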
Claude Code calls this endpoint before each request to check conversation size and decide whether to trigger compaction. Without it, the gateway returns 404, Claude Code cannot estimate context usage, and long conversations eventually hit the upstream CONTENT_LENGTH_EXCEEDS_THRESHOLD error (400). The endpoint builds the full Kiro payload and counts tokens on the serialized JSON using tiktoken, consistent with the token-counting approach used in the messages endpoint.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Replace validate_tool_names (400 error) with deterministic truncation for tool names exceeding the 64-char Kiro API limit. Names are shortened to 55 chars + '_' + an 8-char md5 hash. The mapping is reversed in responses so clients receive the original names. Fixes MCP plugins with auto-generated names like mcp__plugin_cloudflare_cloudflare-docs__search_cloudflare_documentation
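The truncation scheme above can be sketched as follows (function name is illustrative; the reverse mapping the gateway keeps for responses is omitted):

```python
import hashlib

KIRO_TOOL_NAME_LIMIT = 64  # Kiro API rejects tool names longer than this


def truncate_tool_name(name: str) -> str:
    """Deterministically shorten a tool name to fit the 64-char limit.

    Scheme from the commit message: 55 chars + '_' + 8-char md5 hash,
    so the same long name always maps to the same short name and the
    gateway can keep a stable short-to-original mapping.
    """
    if len(name) <= KIRO_TOOL_NAME_LIMIT:
        return name
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()[:8]
    return name[:55] + "_" + digest  # 55 + 1 + 8 = 64 chars
```

Determinism matters here: a random suffix would break tool-call matching across requests.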
…s endpoint Claude Code sends /v1/messages/count_tokens without max_tokens since it only needs token counting, not generation. The required max_tokens field caused 422 validation errors, breaking context usage tracking and preventing conversation compaction.
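A sketch of the validation change, assuming a Pydantic request model (the model and field names here are illustrative, not the gateway's actual class): making max_tokens optional with a default lets counting-only calls pass validation.

```python
from typing import Optional

from pydantic import BaseModel


class AnthropicMessagesRequest(BaseModel):
    model: str
    messages: list[dict]
    # Optional with a default (4096) so counting-only calls from Claude Code,
    # which omit max_tokens entirely, no longer fail with a 422.
    max_tokens: Optional[int] = 4096
```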
…s response

- Make AnthropicTool.input_schema optional to accept Anthropic built-in server tools (web_search, code_execution, bash, text_editor) that don't have an input_schema. These were causing 422 validation errors.
- Silently strip server tools in the converter since the Kiro API can't handle them, while keeping custom tools working as before.
- Add context_management.original_input_tokens to the count_tokens response to match the Anthropic API spec.
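The stripping step in the converter might look like this (a sketch over plain dicts; the real converter works on the gateway's own tool models):

```python
def split_server_tools(tools: list[dict]) -> tuple[list[dict], list[dict]]:
    """Separate Anthropic built-in server tools from custom tools.

    Built-in server tools (web_search, code_execution, bash, text_editor)
    carry a versioned `type` field and no `input_schema`; the Kiro API
    cannot execute them, so the converter drops them and forwards only
    custom tools.
    """
    custom, server = [], []
    for tool in tools:
        if tool.get("input_schema") is None:
            server.append(tool)
        else:
            custom.append(tool)
    return custom, server
```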
Thanks for the PR! 🎉 Before merge, we need a one-time CLA confirmation. Full CLA text: Please reply once with: "I have read the CLA and I accept its terms". You only need to write it once; all further messages from me can be ignored.
I guess this is not a good solution for problem 1. It just ignores the problem. Kiro IDE can use the web search tool, so there must be a way to use it via the API.
Good point @Met4physics. You're right that Kiro IDE supports web search internally. The issue is that the Kiro API's tool format requires an input_schema for every tool. If we pass server tools as-is, the Kiro API rejects them with a 422. If we fabricate an input_schema, Kiro still won't actually execute the tool. Stripping is the safe default to prevent 422 errors. But I agree this could be improved:

Happy to iterate on this if there's a known way to trigger Kiro's web search via the API.
- Fix server tool detection: check 'input_schema is None', not 'not input_schema' (an empty dict {} is valid for a tool with no parameters)
- Update anthropic_to_kiro tests to use .payload on KiroPayloadResult
- Update build_kiro_payload tests to use .payload on KiroPayloadResult
- Replace validate_tool_names tests with truncate_tool_names tests
- Update AnthropicTool tests for optional name/input_schema
- Update max_tokens test for optional default (4096)
All 1412 tests pass.
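The detection fix from the first bullet, isolated (a sketch; the predicate name is illustrative): a falsy check would misclassify a parameterless custom tool as a server tool.

```python
def is_server_tool(input_schema) -> bool:
    """True only when input_schema is genuinely absent.

    `not input_schema` would be wrong here: an empty dict {} is a valid
    schema for a custom tool that takes no parameters, but it is falsy,
    so the tool would be silently stripped as a server tool.
    """
    return input_schema is None
```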
I have read the CLA and I accept its terms |
Problems

1. Built-in server tools cause 422 errors (25 occurrences in logs)

Anthropic's built-in server tools (web_search_20250305, code_execution_20250522, bash_20241022, text_editor_20250124, etc.) don't have input_schema; they use a type field instead. The gateway's AnthropicTool model required input_schema, causing 422 validation errors.

2. count_tokens response missing context_management

The Anthropic API returns {"input_tokens": N, "context_management": {"original_input_tokens": N}} but the gateway only returned {"input_tokens": N}. Clients expecting the full response format may break.

Solution

- Make AnthropicTool.input_schema optional and add a type field with extra="allow" to accept any built-in tool fields
- Silently strip server tools (those without input_schema) in the converter since the Kiro API can't handle them; custom tools work as before
- Add context_management.original_input_tokens to the count_tokens response

Backward compatible

- Custom tools with input_schema work identically
- Depends on
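The model change in the first Solution bullet could look like this, assuming Pydantic v2 (class and field names follow the description above; defaults are illustrative):

```python
from typing import Any, Optional

from pydantic import BaseModel, ConfigDict


class AnthropicTool(BaseModel):
    # extra="allow" keeps unknown built-in tool fields (e.g. cache_control,
    # max_uses) instead of rejecting the request with a 422.
    model_config = ConfigDict(extra="allow")

    name: Optional[str] = None
    type: Optional[str] = None          # set on built-in server tools
    input_schema: Optional[dict[str, Any]] = None  # None marks a server tool
```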