TypeError: int() argument must be a string ... not 'NoneType' in _accumulate_stream_items when streaming tool calls from Gemini #3697

@GDegrove


Bug Description

When streaming chat completions with tool calls from Google's OpenAI-compatible Gemini API, the instrumentor crashes with an unhandled TypeError that propagates up and kills the response stream:

File "opentelemetry/instrumentation/openai/shared/chat_wrappers.py", line 1176, in _accumulate_stream_items
    i = int(tool_call["index"])
TypeError: int() argument must be a string, a bytes-like object or a real number, not 'NoneType'

Root Cause

Gemini's OpenAI-compatible API returns `index: null` on streaming tool call delta chunks. _accumulate_stream_items unconditionally calls int(tool_call["index"]) with no guard for None.

# chat_wrappers.py line 1176 — crashes when index is None
i = int(tool_call["index"])
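
The failure is easy to reproduce in isolation. The chunk below is a hand-written approximation of a Gemini streaming tool call delta, not captured output:

```python
# A tool call delta as Gemini's OpenAI-compatible endpoint emits it:
# "index" is present but null, unlike OpenAI's own API.
tool_call = {"index": None, "id": "call_1", "function": {"name": "get_weather"}}

try:
    i = int(tool_call["index"])  # same expression as chat_wrappers.py line 1176
except TypeError as exc:
    print(exc)  # int() argument must be a string, ... not 'NoneType'
```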

Why Non-Streaming Is Not Affected
Two reasons:

  1. _handle_response is decorated with @dont_throw, so any exception is silently caught and logged.
  2. _set_completions iterates tool calls using enumerate() and never accesses tool_call["index"].

For streaming, _process_item has no @dont_throw decorator, so the TypeError propagates all the way up through the Starlette response stream and crashes the request.
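
For illustration, a @dont_throw-style decorator amounts to the following sketch (a hypothetical re-implementation, not the library's actual code):

```python
import logging
from functools import wraps

logger = logging.getLogger(__name__)


def dont_throw(func):
    # Swallow any exception raised inside an instrumentation hook and log it,
    # so a telemetry bug never crashes the instrumented application.
    @wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            logger.exception("instrumentation failed in %s", func.__name__)
            return None
    return wrapper


@dont_throw
def process_item(chunk):
    return int(chunk["index"])  # raises TypeError when index is None


print(process_item({"index": None}))  # logged and returns None, no crash
```

With the decorator applied, the same None index that crashes the streaming path is merely logged, which is why the non-streaming path never surfaces this bug.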

Suggested Fix
One-line fix on line 1176:

# Before
i = int(tool_call["index"])

# After
i = int(tool_call["index"] or 0)

Or, for consistency with the non-streaming code path, add @dont_throw to _process_item.
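
The `or 0` guard works because any falsy index (None, in Gemini's case) falls back to slot 0, which matches Gemini streams that emit a single tool call per delta. A minimal check, using `safe_index` as a hypothetical helper name:

```python
# Hypothetical helper illustrating the proposed guard: a None (or missing)
# index coerces to slot 0; real integer indices pass through unchanged.
def safe_index(tool_call):
    return int(tool_call.get("index") or 0)


print(safe_index({"index": None}))  # 0
print(safe_index({"index": 2}))     # 2
print(safe_index({}))               # 0
```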

Reproduction

import asyncio

from openai import AsyncOpenAI
from opentelemetry.instrumentation.openai import OpenAIInstrumentor

OpenAIInstrumentor().instrument()


async def main():
    client = AsyncOpenAI(
        api_key="<GEMINI_API_KEY>",
        base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
    )

    stream = await client.chat.completions.create(
        model="gemini-2.5-flash",
        stream=True,
        messages=[{"role": "user", "content": "What's the weather in Paris?"}],
        tools=[{
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather",
                "parameters": {
                    "type": "object",
                    "properties": {"location": {"type": "string"}},
                    "required": ["location"],
                },
            },
        }],
    )

    async for chunk in stream:  # ← crashes here
        print(chunk)


asyncio.run(main())

Environment

  • opentelemetry-instrumentation-openai==0.52.3 (latest)
  • OpenAI Python SDK v1
  • Provider: Google Gemini via OpenAI-compatible endpoint (generativelanguage.googleapis.com/v1beta/openai/)
  • Python 3.14
