Conversation

@dpolistwm dpolistwm commented Nov 21, 2025

Description

This PR adds support for preserving Gemini's thoughtSignature field during function calling, which is required by Gemini 3 Pro for multi-turn conversations with tools.

Problem

When using Gemini 3 Pro with function calling, the model returns a thought_signature field that must be passed back in subsequent requests. Currently, Strands drops this field during streaming and message reconstruction, causing 400 errors:
Unable to submit request because function call tool_name in the 2. content block is missing a thought_signature

This affects multi-turn conversations, nested agents, and any workflow requiring multiple function call exchanges with Gemini 3 Pro.

Solution

Framework Changes:

  1. src/strands/types/tools.py: Added thoughtSignature: NotRequired[str] to ToolUse TypedDict (changed to total=False to support optional field)
  2. src/strands/types/content.py: Added thoughtSignature: NotRequired[str] to ContentBlockStartToolUse TypedDict and added NotRequired import
  3. src/strands/event_loop/streaming.py: Preserve thoughtSignature when processing streaming chunks (2 locations: extracting from tool use data and passing through to ToolUse object creation)

Gemini Provider Changes:
4. src/strands/models/gemini.py:

  • Capture thought_signature from Gemini function call responses
  • Base64 encode for storage in message history
  • Decode and pass back to Gemini in subsequent requests
  • Configure thinking_config to disable thinking text while preserving signatures
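The type changes in items 1–2 can be sketched as follows. This is a minimal illustration, not the full SDK types: the field names follow the PR description, and the surrounding fields are shown only for context.

```python
# Sketch of the optional thoughtSignature field on a tool-use TypedDict.
try:  # NotRequired moved into the stdlib in Python 3.11
    from typing import NotRequired, TypedDict
except ImportError:
    from typing_extensions import NotRequired, TypedDict


class ToolUse(TypedDict, total=False):
    toolUseId: str
    name: str
    input: dict
    # Present only when Gemini returns a thought signature
    thoughtSignature: NotRequired[str]


plain: ToolUse = {"toolUseId": "t1", "name": "get_weather", "input": {}}
signed: ToolUse = {**plain, "thoughtSignature": "c2ln"}

# Omitting the field stays valid, so other providers are unaffected.
assert "thoughtSignature" not in plain
assert signed["thoughtSignature"] == "c2ln"
```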

Technical Details

The thoughtSignature is an encrypted token provided by Gemini that preserves the model's reasoning context. It's:

  • Returned as bytes from the Gemini API
  • Stored as Base64-encoded string in message history (for JSON serialization)
  • Decoded back to bytes when sending subsequent requests
  • Optional field (NotRequired[str]) that doesn't affect other providers
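The bytes-to-string round trip described above can be sketched like this (illustrative helper names, not the PR's actual functions):

```python
import base64


def encode_signature(sig: bytes) -> str:
    # Gemini returns the signature as raw bytes; Base64-encode it so it
    # can be stored in JSON-serializable message history.
    return base64.b64encode(sig).decode("ascii")


def decode_signature(stored: str) -> bytes:
    # Restore the exact bytes before passing them back to Gemini.
    return base64.b64decode(stored)


raw = b"\x01\x02opaque-token"
stored = encode_signature(raw)
assert isinstance(stored, str)          # safe to embed in JSON
assert decode_signature(stored) == raw  # lossless round trip
```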

Reference: Gemini Thought Signatures Documentation

Related Issues

None - this is a new feature to support Gemini 3 Pro's function calling requirements.

Documentation PR

N/A - No documentation changes needed (inline code comments added for clarity)

Type of Change

New feature

Testing

How have you tested the change?

Tested with:

  • ✅ Gemini 3 Pro (gemini-3-pro-preview) with function calling
  • ✅ Multi-turn conversations with nested agents
  • ✅ Sequential and parallel function calls
  • ✅ Backward compatibility verified with Bedrock and OpenAI providers

Results:

  • ✅ No more 400 errors about missing thought_signature
  • ✅ Function calling works correctly across multiple turns
  • ✅ Nested agents (agents calling other agents) work properly
  • ✅ Optional field doesn't affect models that don't use it
  • ✅ No breaking changes to existing functionality

Verified in consuming repositories:

  • agents-tools: ✅ No warnings introduced
  • Existing Bedrock/OpenAI workflows: ✅ Continue working unchanged

Note on CI: There is 1 pre-existing mypy error in src/strands/tools/tools.py:47 (unused type: ignore comment) that is unrelated to these changes. All type checking passes for the files modified in this PR.

  • I ran hatch run prepare

Checklist

  • I have read the CONTRIBUTING document
  • I have added any necessary tests that prove my fix is effective or my feature works
  • I have updated the documentation accordingly
  • I have added an appropriate example to the documentation to outline the feature, or no new docs are needed
  • My changes generate no new warnings
  • Any dependent changes have been merged and published

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

Related Issues

Fixes #1199

- Add thoughtSignature field to ToolUse TypedDict as optional field
- Add thoughtSignature to ContentBlockStartToolUse TypedDict
- Preserve thoughtSignature during streaming event processing
- Fixes compatibility with Gemini 3 Pro thinking mode

This change enables proper multi-turn function calling with Gemini 3 Pro,
which requires thought_signature to be passed back in subsequent requests.

Resolves: Gemini 3 Pro 400 error for missing thought_signature
See: https://ai.google.dev/gemini-api/docs/thought-signatures
- Capture thought_signature from Gemini function call responses
- Base64 encode thought_signature for storage in message history
- Decode and pass thought_signature back to Gemini in subsequent requests
- Configure thinking_config to disable thinking text but preserve signatures
- Add NotRequired import to content.py for type safety

This complements the framework changes by implementing Gemini-specific
handling of thought signatures for proper multi-turn function calling
with Gemini 3 Pro.

See: https://ai.google.dev/gemini-api/docs/thought-signatures
- Fix variable name conflict with thought_signature
- Break long lines to comply with 120 character limit
- Use explicit type annotations for thought signature variables
@dpolistwm dpolistwm force-pushed the feat/preserve-thought-signature-gemini branch from ed1837d to a20b7ea Compare November 21, 2025 17:16
@Ratish1
Contributor

Ratish1 commented Nov 21, 2025

@dpolistwm I see that you have created two new test files. I don't recommend that: a tools test file already exists, and I think the maintainers wouldn't want new test files for small pieces of logic. Make sure you only add tests for the fix you are proposing. The best option here is probably to wait for a maintainer to see what approach they would take. Thanks.

- Rename test_tools.py -> test_tool_use.py (tests ToolUse TypedDict)
- Rename test_content.py -> test_content_block_start_tool_use.py (tests ContentBlockStartToolUse TypedDict)

This makes the test file names more descriptive and avoids confusion with tests/strands/tools/test_tools.py
@dpolistwm
Author

Hey @mkmeral, I understand your concern, but implementing this workaround and degrading the thought process of such a popular model would make Strands Agents a "second-class citizen" among other frameworks in the market.

Many users may stop using Strands Agents with Gemini 3 Pro (one of the most popular, fast, and powerful LLMs on the market) if this is the official solution, and would probably choose ADK instead. While ADK does not support AWS AgentCore directly (memory, etc.), it does support Google's equally feature-rich Agent Engine counterpart. We would certainly consider that.

Please take this risk into account when making your decision.

@mkmeral
Contributor

mkmeral commented Dec 5, 2025

@dpolistwm actually I think I found a way to get the feature working with your PR, and leave the tool use event type as is.

Gemini 3 Pro only validates the latest thought signature, not all of them. So what we can do instead is keep track of the thought signatures in self.thought_signatures in gemini.py. That way thought signatures would work out of the box, and we would only need to change gemini model provider.

Can you update your PR following this approach?

One other thing to note: we would also still want to apply skip_thought_signatures if no thought signatures exist. That would allow developers to switch models for the same agent and keep them working.

Does this make sense?
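The instance-level tracking proposed above can be sketched as follows. The class and method names here are hypothetical, not the actual provider code; the point is that only the latest signature is retained, per model instance.

```python
from typing import Optional


class GeminiModelSketch:
    """Minimal sketch of instance-level thought-signature tracking."""

    def __init__(self) -> None:
        # Gemini 3 Pro only validates the latest signature, so one slot suffices.
        self.last_thought_signature: Optional[bytes] = None

    def on_response_part(self, thought_signature: Optional[bytes]) -> None:
        # Called for each function-call part in a Gemini response.
        if thought_signature is not None:
            self.last_thought_signature = thought_signature

    def signature_for_request(self) -> Optional[bytes]:
        # Attached to the outgoing function-call part; None would mean
        # falling back to skipping signature validation.
        return self.last_thought_signature


model = GeminiModelSketch()
model.on_response_part(b"sig-1")
model.on_response_part(None)  # parts without a signature don't clobber it
assert model.signature_for_request() == b"sig-1"
```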

@dpolistwm
Author

I'll try and make this work ;)

This refactors the thoughtSignature feature to be self-contained within
the GeminiModel class instead of spreading it across the SDK:

Changes:
- Removed thoughtSignature from ContentBlockStartToolUse (content.py)
- Removed thoughtSignature from ToolUse (tools.py)
- Removed thoughtSignature handling from streaming.py
- Added self.last_thought_signature instance attribute to GeminiModel
- GeminiModel now stores thought_signature from responses and uses it
  for subsequent requests automatically

Benefits:
- Cleaner SDK types without Gemini-specific fields
- Simpler implementation without base64 encoding/decoding in messages
- thought_signature is automatically tracked per model instance
- No changes required to SDK-level streaming pipeline

Tests updated accordingly to verify internal storage behavior.
…ini.py

Reverted content.py, tools.py, streaming.py and test files to their
original state. thoughtSignature handling is now self-contained in
GeminiModel using self.last_thought_signature.
@dpolistwm
Author

Now the only changed file in this PR is gemini.py with the last thought signature. I tested it locally against a real project and it is working fine 💯

@dpolistwm
Author

Working on "0 tools" implementation (getting a 400 error)...

@dpolistwm
Author

Looks fine now to me. Please let me know if you need anything else

@Ratish1
Contributor

Ratish1 commented Dec 6, 2025

Hello @mkmeral, could my PR also be used in some way? Since @dpolistwm mentions Gemini 3 doesn't work in my PR, maybe we can figure out a way to fix it. Or if this PR is already correct, I will close mine, I guess.

Returns:
Gemini request config.
"""
# Disable thinking text output when tools are present
Contributor

Why? Developers can currently configure this themselves, right? Do we need to set it explicitly for everyone?

Author

Makes perfect sense.


# Store the last thought_signature from Gemini responses for multi-turn conversations
# See: https://ai.google.dev/gemini-api/docs/thought-signatures
self.last_thought_signature: Optional[bytes] = None
Contributor

can we default to skip? This way, if developers change model provider (e.g. anthropic to gemini), the agent would still work

Author

@dpolistwm dpolistwm Dec 8, 2025

This is contained within the Gemini model implementation, so it doesn't affect other providers. Could you please elaborate? (The absence of thought signatures is also the root cause of the current incompatibility with Gemini 3 Pro.)

Contributor

Actually 2 use cases:

First, customers can switch model providers. So imagine they start with Bedrock, and then they want to change the agent's model provider to gemini 3

# Written in semi-pseudocode, don't focus on the code too much 😅
agent = Agent(model=BedrockModel(), tools=[tool1, tool2])

# agent is being used with tools
agent(...) 

# later (for some reason) devs might try to update the model providers to Gemini 3
agent.model = GeminiModel(modelId="3-pro")

# agent is being invoked. At this point, it has the original message history with tool use, but it has a clean gemini model provider. If we do not skip the missing signatures here, the request fails.
agent(...)

Second, session managers can be problematic. If the agent's context is loaded from a session manager, the thought signature will not exist. For these cases we need to ensure we do not throw an exception.

# Create a session manager with a unique session ID
session_manager = FileSessionManager(session_id="test-session")
agent = Agent(session_manager=session_manager)
agent("Hello!")  # This conversation is persisted


# Couple hours later, in another runtime instance, running the same code again
session_manager = FileSessionManager(session_id="test-session") # Loads the messages from file system
agent = Agent(session_manager=session_manager)
agent("Hello!")  # Throws error due to though signatures

Author

From what I understood reading Google's docs, the requirement to preserve thought signatures applies within a turn/cycle of the agent loop. So, if a session/memory was created with another model, the next turn executed with Gemini 3 would introduce a fresh thought signature and use it. I can try to run both scenarios to reproduce.

Author

@dpolistwm dpolistwm Dec 8, 2025

Here's a quick script I've used:

#!/usr/bin/env python3
"""Example script using strands agents with FileSystem session manager."""

import os
from uuid import uuid4
from dotenv import load_dotenv

load_dotenv()
from strands import Agent, tool
from strands.models.gemini import GeminiModel
from strands.session import FileSessionManager
from strands.models.bedrock import BedrockModel

@tool
def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny."

@tool
def greet_user(user_name: str) -> str:
    return f"Hello, {user_name}! How can I help you today?"


def main():
    """Main function demonstrating strands agents with FileSessionManager."""
    # Create a storage directory for sessions
    storage_dir = os.path.join(os.getcwd(), "sessions")
    
    # Generate a session ID (or use a fixed one for persistence)
    session_id = str(uuid4())
    print(f"Session ID: {session_id}")
    
    # Create FileSessionManager
    session_manager = FileSessionManager(
        session_id=session_id,
        storage_dir=storage_dir
    )
    
    # Create GeminiModel with gemini-3-pro-preview
    gemini_model = GeminiModel(
        model_id="gemini-3-pro-preview",
    )

    bedrock_model = BedrockModel(
        model_id="us.anthropic.claude-3-7-sonnet-20250219-v1:0",
    )
    
    # Create an agent with the session manager and Bedrock model
    agent = Agent(
        system_prompt="You are a helpful assistant that can answer questions and help with tasks. When the user provides their name, greet them using the greet_user tool.",
        model=bedrock_model, 
        tools=[greet_user, get_weather], 
        session_manager=session_manager,
        agent_id="my_agent"
    )
    
    agent("Hello! My name is Daniel.")

    agent.model = gemini_model
    agent("What is the weather in Tokyo?")


if __name__ == "__main__":
    main()

Worked fine :) Here's the output:

➜  test-gemini3 uv run python main.py
Session ID: 8721956e-28db-4ee0-abe1-68a39ad64d08

Tool #1: greet_user
Hello Daniel! It's nice to meet you. How can I assist you today? I'm here to help answer questions or assist with tasks you might have.
Tool #2: get_weather
The weather in Tokyo is currently sunny

id=content["toolUse"]["toolUseId"],
name=content["toolUse"]["name"],
),
thought_signature=self.last_thought_signature,
Contributor

This would apply it to all tool uses right? Is there a way we can make it more specific?

Author

@dpolistwm dpolistwm Dec 8, 2025

This is required per Gemini docs:

"As a general rule, if you receive a thought signature in a model response, you should pass it back exactly as received when sending the conversation history in the next turn. When using Gemini 3 Pro, you must pass back thought signatures during function calling, otherwise you will get a validation error (4xx status code)."

Contributor

I meant more of this line, where we don't actually need to set this for all tool calls, just the first one. That said, this is more of a nit comment

The first functionCall part in each step of the current turn must include its thought_signature

https://ai.google.dev/gemini-api/docs/thought-signatures
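The "first functionCall part only" rule quoted above can be sketched as follows. The part structure here is a hypothetical plain-dict stand-in, not the real Gemini SDK types:

```python
from typing import Optional


def attach_signature(parts: list[dict], sig: Optional[bytes]) -> list[dict]:
    # Attach the stored signature to the first function-call part only,
    # leaving subsequent (e.g. parallel) calls untouched.
    attached = False
    out = []
    for part in parts:
        part = dict(part)  # copy so the caller's parts are not mutated
        if not attached and "function_call" in part and sig is not None:
            part["thought_signature"] = sig
            attached = True
        out.append(part)
    return out


parts = [{"function_call": {"name": "a"}}, {"function_call": {"name": "b"}}]
result = attach_signature(parts, b"sig")
assert "thought_signature" in result[0]
assert "thought_signature" not in result[1]
```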

Author

@dpolistwm dpolistwm left a comment

Removed hard-coded thought configuration



Successfully merging this pull request may close these issues.

[BUG] INVALID_ARGUMENT using gemini-3-pro-preview
