.Net: Bug: SK can't call AzureOpenAI o1/o3-mini models #10201
Comments
Related to Azure/azure-sdk-for-net#47809 |
Hi @rwjdk, can you please make sure you are indeed sending the |
@moonbox3: As I'm using the AzureOpenAI service, I'm not sending a version at any point in the call (just the deployment name, which in my case is "o1"). I've checked the raw request to make sure, and no version is mentioned in there. In Azure AI Studio the version is 2024-12-17 (the only option). But looking at the above image I noticed something odd: the target URI, which you do not choose yourself (and never use in SK), says api-version=2024-12-01-preview, so it could be an Azure bug (I'm in the swedenCentral Azure region). Can one alter the target URI version in SK? |
Did a quick test with the Azure.AI.OpenAI (v2.1.0) NuGet package directly and it gives the same error, so I guess the SK team depends on that dependency supporting/working first :-/ |
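For reference, a minimal sketch of the direct Azure.AI.OpenAI (v2.1.0) repro described above. The endpoint, API key, and the "o1" deployment name are placeholders you would substitute with your own values:

```csharp
using System;
using System.ClientModel;
using Azure.AI.OpenAI;
using OpenAI.Chat;

var endpoint = new Uri("https://<your-resource>.openai.azure.com/");
var credential = new ApiKeyCredential("<your-api-key>");

AzureOpenAIClient azureClient = new(endpoint, credential);
ChatClient chatClient = azureClient.GetChatClient("o1"); // Azure deployment name

try
{
    ChatCompletion completion = chatClient.CompleteChat("Hello");
    Console.WriteLine(completion.Content[0].Text);
}
catch (ClientResultException ex)
{
    // Per this thread, this fails with HTTP 400:
    // "Model o1 is enabled only for api versions 2024-12-01-preview and later"
    Console.WriteLine(ex.Message);
}
```

This calls the service with whatever default api-version the SDK version pins, which is what triggers the 400 against o1 deployments.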
Yes, you can use a different Azure API version. Tagging @SergeyMenshykh for a specific example on how to in .Net. |
@moonbox3 I tried that, but as the underlying platform does not yet support the version, it still fails with a not-implemented exception :-/ Assume this is what you mean: |
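For context, passing an explicit service version with the Azure.AI.OpenAI SDK would look roughly like the sketch below. The specific `ServiceVersion` enum member shown is an assumption; the point of this thread is that 2024-12-01-preview and later were not yet in the enum, which is why this path throws a not-implemented/not-supported exception:

```csharp
using System;
using System.ClientModel;
using Azure.AI.OpenAI;

// Hypothetical: pick the newest preview version the installed SDK exposes.
var options = new AzureOpenAIClientOptions(
    AzureOpenAIClientOptions.ServiceVersion.V2024_10_01_Preview);

var client = new AzureOpenAIClient(
    new Uri("https://<your-resource>.openai.azure.com/"),
    new ApiKeyCredential("<your-api-key>"),
    options);
```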
It's not supported by the Azure.AI.OpenAI SDK yet - https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/openai/Azure.AI.OpenAI/src/Custom/AzureOpenAIClientOptions.cs |
Apologies for misleading @rwjdk. The ability to specify a custom API version is allowed in SK Python, but per @SergeyMenshykh, this isn't supported in the Azure.AI.OpenAI SDK yet. |
As of now, the SDK doesn't have an option for this. Blocking this issue as it depends on the Azure SDK supporting that API version. Tracking issue: |
As a workaround you can add a handler to your connectors through a custom HttpClient.

**Important:** This is a break-glass scenario and should be dropped as soon as the Azure OpenAI SDK supports the newer API version.

Usage:

```csharp
var overrideApiVersion = "2025-01-01-preview";
using var httpClient = new HttpClient(new AzureOverrideHandler(overrideApiVersion));

var apiKey = config["AzureOpenAI:ApiKey"]!;
var endpoint = config["AzureOpenAI:Endpoint"]!;

var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion("o1-mini", endpoint, apiKey, httpClient: httpClient)
    .Build();
```

HTTP handler (for the version override and the max token count fix):

```csharp
using System.Net.Http.Headers;
using System.Text.RegularExpressions;

public partial class AzureOverrideHandler : HttpClientHandler
{
    private readonly string? _overrideApiVersion;

    public AzureOverrideHandler(string? overrideApiVersion = null)
    {
        this._overrideApiVersion = overrideApiVersion;
    }

    protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        using var oldContent = request.Content;
        if (oldContent is not null && request.RequestUri is not null)
        {
            var requestBody = await oldContent.ReadAsStringAsync(cancellationToken);

            // o1-family models reject "max_tokens"; rename it to "max_completion_tokens".
            if (requestBody.Contains("\"model\":\"o1") && requestBody.Contains("\"max_tokens\":"))
            {
                requestBody = requestBody.Replace("\"max_tokens\":", "\"max_completion_tokens\":");
                request.Content = new StringContent(requestBody, new MediaTypeHeaderValue("application/json"));
            }
            // Console.WriteLine("Request body: " + requestBody);
        }

        if (this._overrideApiVersion is not null && request.RequestUri is not null)
        {
            var requestUri = request.RequestUri.ToString();

            // Replace the trailing api-version (e.g. "2024-08-01-preview"), or append one if absent.
            var currentVersion = CurrentApiVersionRegex().Match(requestUri).Value;
            request.RequestUri = !string.IsNullOrEmpty(currentVersion)
                ? new Uri(requestUri.Replace(currentVersion, this._overrideApiVersion))
                : new Uri($"{requestUri}?api-version={this._overrideApiVersion}");
            // Console.WriteLine(request.RequestUri);
        }

        return await base.SendAsync(request, cancellationToken);
    }

    [GeneratedRegex(@"\d{4}-\d{2}-\d{2}(-preview)?$")]
    public static partial Regex CurrentApiVersionRegex();
}
```
|
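To sanity-check the URI rewrite in isolation, here is a small standalone sketch of the same regex logic (the sample URI and the `RewriteApiVersion` helper name are illustrative, not part of the workaround above):

```csharp
using System;
using System.Text.RegularExpressions;

static string RewriteApiVersion(string requestUri, string overrideApiVersion)
{
    // Same pattern as AzureOverrideHandler: match a trailing api version
    // such as "2024-08-01-preview" or "2024-10-21" at the end of the URI.
    var current = Regex.Match(requestUri, @"\d{4}-\d{2}-\d{2}(-preview)?$").Value;
    return string.IsNullOrEmpty(current)
        ? $"{requestUri}?api-version={overrideApiVersion}"
        : requestUri.Replace(current, overrideApiVersion);
}

var uri = "https://example.openai.azure.com/openai/deployments/o1/chat/completions?api-version=2024-08-01-preview";
Console.WriteLine(RewriteApiVersion(uri, "2025-01-01-preview"));
// Prints the same URI ending in "api-version=2025-01-01-preview"
```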
Cool. Thank you @RogerBarreto. Will try it out first thing tomorrow |
@RogerBarreto: as a follow-up to your nice workaround (sorry for the delay in answering)... just some notes on what works and does not work so far using SK against the o1 model (with the break-glass approach). No blockers for me personally, and the remaining issues are probably on the Azure.AI.OpenAI NuGet team's side, but just FYI.

Works:

Does not work:
|
FYI: This also works with o3-mini 👍 with the same limitations (plus the missing option to set o3-mini's thinking mode to "low", "medium" or "high") |
@rwjdk is there a way to set thinking mode in the solution provided by @RogerBarreto for both o1 and o3-mini? |
@dsfsdsdsfsds Not as far as I can see... it works, but one would guess medium is the default used |
@rwjdk Thanks for the feedback. Regarding the
Attempting to use tools with o1-mini also gives an error, as tools are not supported.
Now running the plugins against o3-mini, the result was as expected and function calling triggered both the function and the filter. Here's the updated use-case code showing how I managed to do it:

```csharp
var builder = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion("o1-mini", endpoint, apiKey, httpClient: httpClient);

// Ensure you add the filter using the `IAutoFunctionInvocationFilter` interface into the `IServiceCollection`.
builder.Services.AddSingleton<IAutoFunctionInvocationFilter>(new MyAutoFunctionInvocationFilter());

var myFunction = KernelFunctionFactory.CreateFromMethod(() => DateTime.Now.ToString("G"), "CurrentDate");
builder.Plugins.Add(KernelPluginFactory.CreateFromFunctions("myPlugin", [myFunction]));

var kernel = builder.Build();
var chatService = kernel.GetRequiredService<IChatCompletionService>();
var response = await chatService.GetChatMessageContentAsync(
    "What is the current date?",
    new OpenAIPromptExecutionSettings
    {
        FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(), // Enable the function calling feature
        MaxTokens = 1000,
        Temperature = 1
    },
    kernel); // It is important to provide the kernel so the functions are advertised to the AI model.

public class MyAutoFunctionInvocationFilter : IAutoFunctionInvocationFilter
{
    public async Task OnAutoFunctionInvocationAsync(AutoFunctionInvocationContext context, Func<AutoFunctionInvocationContext, Task> next)
    {
        Console.WriteLine("Before function invocation");
        await next(context);
        Console.WriteLine("After function invocation");
    }
}
```
|
New issue opened on the Azure SDK side: Azure/azure-sdk-for-net#48110 regarding o3-mini |
Yes, you can set the thinking mode with this change:
|
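The change itself did not survive in this thread. A plausible sketch, assuming the service accepts a `reasoning_effort` body parameter for o3-mini (the parameter name and its "low"/"medium"/"high" values follow the OpenAI reasoning-model API; whether a given Azure deployment honors it depends on the api-version in use), would be to splice it into the request body inside the same handler:

```csharp
using System;

// Hedged sketch: inject "reasoning_effort" right after the opening brace of
// the serialized JSON request body, before the handler sends it. This is a
// naive string splice, matching the style of the max_tokens rename above.
static string WithReasoningEffort(string requestBody, string effort)
{
    return requestBody.StartsWith("{")
        ? "{\"reasoning_effort\":\"" + effort + "\"," + requestBody.Substring(1)
        : requestBody;
}

Console.WriteLine(WithReasoningEffort("{\"model\":\"o3-mini\"}", "high"));
// Prints: {"reasoning_effort":"high","model":"o3-mini"}
```

You would call this on `requestBody` inside `AzureOverrideHandler.SendAsync` before rebuilding `request.Content`.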
Describe the bug
When doing an agent.InvokeAsync call against the o1 model (version: 2024-12-17) in the Azure OpenAI Service, you get the error:
"HTTP 400 (BadRequest)\r\n\r\nModel o1 is enabled only for api versions 2024-12-01-preview and later"
To Reproduce
Steps to reproduce the behavior:
Expected behavior
A normal LLM response (in this case JSON, due to structured output)
Platform
Additional context
The issue is probably the transitive use of the Azure.AI.OpenAI beta2 package. I tried manually bumping it to the latest version, but that gave a feature-not-implemented exception.