OpenAI's o3-mini doesn't accept `max_tokens` but `max_completion_tokens` instead. When running:

```python
openai_o3_mini = OpenAIServerModel(model_id="o3-mini", api_key=OPENAI_API_KEY)
agent = CodeAgent(tools=[DuckDuckGoSearchTool()], model=openai_o3_mini)
agent.run("What is the weather in Tokyo?")
```
I get:

```
...
Error in generating model output:
Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}
```
This is with smolagents version = "1.7.0".
The temporary workaround I found is to delete that default parameter from the kwargs:
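The original snippet for the workaround was lost here, so below is a minimal sketch of the kwargs-renaming logic, assuming the model's completion kwargs are an ordinary dict that gets forwarded to the OpenAI client (the helper name is illustrative, not a smolagents API):

```python
def patch_completion_kwargs(kwargs: dict) -> dict:
    """Rename `max_tokens` to `max_completion_tokens` for reasoning models.

    Reasoning models such as o3-mini, o1, and o1-mini reject the legacy
    `max_tokens` parameter, so we move its value to the new name instead
    of simply dropping it.
    """
    kwargs = dict(kwargs)  # copy so the caller's dict is untouched
    if "max_tokens" in kwargs:
        kwargs["max_completion_tokens"] = kwargs.pop("max_tokens")
    return kwargs


# Example: the default kwargs smolagents might build for a request
patched = patch_completion_kwargs({"max_tokens": 512, "temperature": 1})
print(patched)  # {'max_completion_tokens': 512, 'temperature': 1}
```

Renaming (rather than deleting) the parameter keeps the configured token limit in effect; deleting it also works but falls back to the API's default limit.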
That way it works. I believe this will be a problem with all reasoning models like o1 and o1-mini.