
OllamaChatClient doesn't check for status code before deserializing. #5820

Closed
eiriktsarpalis opened this issue Jan 27, 2025 · 0 comments · Fixed by #5821
Labels
ai-ready-to-implement area-AI bug This issue describes a behavior which is not expected - a bug. work in progress 🚧

Comments

@eiriktsarpalis
Member

Description

Per the title, the client currently doesn't check response status codes before deserializing in both the non-streaming

```csharp
var response = (await httpResponse.Content.ReadFromJsonAsync(
    JsonContext.Default.OllamaChatResponse,
    cancellationToken).ConfigureAwait(false))!;

if (!string.IsNullOrEmpty(response.Error))
{
    throw new InvalidOperationException($"Ollama error: {response.Error}");
}
```

and streaming implementations

```csharp
using var httpResponse = await _httpClient.SendAsync(request, HttpCompletionOption.ResponseHeadersRead, cancellationToken).ConfigureAwait(false);
using var httpResponseStream = await httpResponse.Content
#if NET
    .ReadAsStreamAsync(cancellationToken)
#else
    .ReadAsStreamAsync()
#endif
    .ConfigureAwait(false);
```

In practice, this results in a single empty chat update being returned when, for example, Ollama returns a 404 response. We should fix this and also verify that OllamaSharp doesn't suffer from the same issue.
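One minimal way to address this (a sketch only; it is not necessarily what the linked fix does, and the error-message wording is illustrative) is to check the status code before attempting to deserialize the body:

```csharp
// Hypothetical sketch: validate the HTTP status before deserializing.
// A non-2xx response from Ollama typically carries a plain-text or JSON
// error body rather than a valid OllamaChatResponse payload.
using var httpResponse = await _httpClient.SendAsync(
    request, HttpCompletionOption.ResponseHeadersRead, cancellationToken).ConfigureAwait(false);

if (!httpResponse.IsSuccessStatusCode)
{
    // Reading the error body is optional; done here to produce a clearer message.
    string errorBody = await httpResponse.Content.ReadAsStringAsync().ConfigureAwait(false);
    throw new InvalidOperationException(
        $"Ollama request failed with status {(int)httpResponse.StatusCode}: {errorBody}");
}

var response = (await httpResponse.Content.ReadFromJsonAsync(
    JsonContext.Default.OllamaChatResponse,
    cancellationToken).ConfigureAwait(false))!;
```

Alternatively, `httpResponse.EnsureSuccessStatusCode()` could be called instead of the manual check, at the cost of losing the response body in the thrown `HttpRequestException`.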

Reproduction Steps

N/A

Expected behavior

Should throw an exception when non-2xx responses are produced.

Actual behavior

Does not throw an exception when non-2xx responses are produced.

Regression?

No response

Known Workarounds

No response

Configuration

No response

Other information

No response
