
Bug: The latest library of SemanticKernel does not support Azure o1 series models #9165

Closed
2023pfm opened this issue Oct 9, 2024 · 6 comments
Labels: ai connector, bug

Comments

2023pfm commented Oct 9, 2024

The latest SemanticKernel library does not support the Azure o1 series models.
The error is: Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
Additionally, tool calling may also fail with an error stating that tools are not supported.
What should I do now? Thank you.
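
A minimal sketch of how this failure typically surfaces (this snippet is not from the original report; the deployment name, endpoint, key, and exact connector namespaces are placeholders and may vary by SK version):

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "o1-preview",   // hypothetical o1 deployment name
        endpoint: "https://my-resource.openai.azure.com/",
        apiKey: Environment.GetEnvironmentVariable("AOAI_KEY")!)
    .Build();

// In the affected versions, setting MaxTokens makes the Azure connector emit
// "max_tokens", which the o1 models reject with the error quoted above.
var settings = new AzureOpenAIPromptExecutionSettings { MaxTokens = 1000 };

var result = await kernel.InvokePromptAsync(
    "Explain the benefits of unit testing.",
    new KernelArguments(settings));

Console.WriteLine(result);
```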

2023pfm added the 'bug' label Oct 9, 2024
markwallace-microsoft added the 'ai connector' label and removed the 'triage' label Oct 9, 2024
@RogerBarreto (Member)

@2023pfm, can you provide more context for this issue?

What version were you using? Can you share some code to reproduce the problem?

We are currently using the new property from the OpenAI SDK, which already supports the new max_completion_tokens.

The OpenAI SDK uses the new name behind the scenes, as you can see here:
https://github.com/openai/openai-dotnet/blob/c49dd7065215bc0d094c7f79ccd634a38f0d7b66/src/Custom/Chat/ChatCompletionOptions.cs#L156
There you can also see their comments on the deprecated max_tokens.

So everything suggests you may be using an older release of our packages; this problem should not happen when using SK 1.22 or later.

Let me know if you still have problems with the latest 1.22, and please share more details about the error.

github-project-automation bot moved this from Bug to Sprint: Done in Semantic Kernel Oct 9, 2024
2023pfm (Author) commented Oct 10, 2024

[screenshot attached]

How can MaxOutputTokenCount be set via the AzureOpenAIPromptExecutionSettings class?
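
For reference, one way to reach a max_completion_tokens-style limit without going through SK's execution settings is to call the Azure OpenAI chat client directly. A rough sketch, assuming the Azure.AI.OpenAI 2.x / OpenAI 2.x packages (the endpoint, key, and deployment name are placeholders, and the exact API surface may differ by package version):

```csharp
using Azure;
using Azure.AI.OpenAI;
using OpenAI.Chat;

var azureClient = new AzureOpenAIClient(
    new Uri("https://my-resource.openai.azure.com/"),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("AOAI_KEY")!));

ChatClient chatClient = azureClient.GetChatClient("o1-preview");

// MaxOutputTokenCount is serialized as "max_completion_tokens",
// which the o1 models accept.
var options = new ChatCompletionOptions { MaxOutputTokenCount = 2000 };

ChatCompletion completion = await chatClient.CompleteChatAsync(
    new ChatMessage[] { new UserChatMessage("Hello from o1!") },
    options);

Console.WriteLine(completion.Content[0].Text);
```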

fwaris commented Nov 21, 2024

Running into the same issue on Azure.

MaxTokens is not accepted for o1 models.

o1 supports a new parameter, 'max_completion_tokens', instead.

Vifill commented Dec 16, 2024

I have the same issue, using o1 on Azure with AzureOpenAIPromptExecutionSettings on version 1.31.

Note that it works if I don't set MaxTokens in the AzureOpenAIPromptExecutionSettings, but once I set MaxTokens, it fails.
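
In other words, the interim workaround at this point was simply to leave MaxTokens unset, for example (a sketch, not an official fix):

```csharp
// Omitting MaxTokens means no "max_tokens" parameter is sent,
// so the o1 deployment accepts the request with its default output limit.
var settings = new AzureOpenAIPromptExecutionSettings
{
    // MaxTokens = 1000,   // setting this triggers the "Unsupported parameter" error
};
```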

@markwallace-microsoft (Member)

@RogerBarreto reopening to address the last two comments

RogerBarreto (Member) commented Feb 5, 2025

This issue is a duplicate of the issue below; since this one has less data, I'm closing it. Please consider the following issue:

TL;DR: The Azure SDK currently doesn't support o1 models and fails with the above message when attempting to provide max tokens. Other limitations were also identified, such as the need to use the latest API preview versions.

I provided a workaround, posted there, using a custom HTTP handler that may allow SK to be used against that model.
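
The linked workaround isn't reproduced verbatim here, but the general idea is a custom HTTP handler that rewrites "max_tokens" to "max_completion_tokens" before the request leaves the process. A rough sketch (the handler name, the naive string replacement, and the exact AddAzureOpenAIChatCompletion overload accepting an httpClient are assumptions; assumes .NET 6+ implicit usings):

```csharp
using System.Text;
using Microsoft.SemanticKernel;

// Route Semantic Kernel's Azure OpenAI traffic through the rewriting handler.
var httpClient = new HttpClient(new MaxTokensRenameHandler());

var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "o1-preview",
        endpoint: "https://my-resource.openai.azure.com/",
        apiKey: Environment.GetEnvironmentVariable("AOAI_KEY")!,
        httpClient: httpClient)
    .Build();

// Rewrites "max_tokens" to "max_completion_tokens" in outgoing request bodies.
public sealed class MaxTokensRenameHandler : DelegatingHandler
{
    public MaxTokensRenameHandler() : base(new HttpClientHandler()) { }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        if (request.Content is not null)
        {
            string body = await request.Content.ReadAsStringAsync(cancellationToken);
            if (body.Contains("\"max_tokens\""))
            {
                // Naive string replacement; a more robust version would edit the JSON DOM.
                body = body.Replace("\"max_tokens\"", "\"max_completion_tokens\"");
                request.Content = new StringContent(body, Encoding.UTF8, "application/json");
            }
        }

        return await base.SendAsync(request, cancellationToken);
    }
}
```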
