.Net: FunctionResult serializer should use JsonSerializerOptions.Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping #10389
Labels: .NET (Issue or Pull requests regarding .NET code)
No, it is a .NET issue.
Thanks for your response!
@dmytrostruk can you take a look?
RamType0 added a commit to RamType0/semantic-kernel that referenced this issue on Feb 5, 2025: Use `JsonSerializerOptions.Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping`
github-merge-queue bot pushed a commit that referenced this issue on Feb 10, 2025:
### Motivation and Context

Fixes issue #10389.

### Description

Use `JsonSerializerOptions.Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping` to generate more LLM friendly serialized FunctionResult.

### Contribution Checklist

- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) and the [pre-submission formatting script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts) raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄

Co-authored-by: SergeyMenshykh <[email protected]>
When I use a KernelFunction that returns a user-defined object and it is called via auto function invocation,
the result of the kernel function is converted to a string with JsonSerializer.Serialize via FunctionCallsProcessor.ProcessFunctionResult.
(See semantic-kernel/dotnet/src/InternalUtilities/connectors/AI/FunctionCalling/FunctionCallsProcessor.cs, lines 479 to 494 at f8592ad.)
JsonSerializer.Serialize escapes many characters by default.
So if I have this kind of record,
the content of x comes out as ...
But Azure OpenAI gpt-4o (2024-08-06), at least, could not handle escaped characters like these.
But if I use `JsonSerializerOptions.Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping`,
the content of y is ...,
so it is easily handled by the LLM.
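The record and the actual outputs were elided above, so here is a minimal, self-contained sketch of the behavior being described; the `WeatherReport` record and its sample text are hypothetical stand-ins, not the reporter's original code:

```csharp
using System;
using System.Text.Encodings.Web;
using System.Text.Json;

var result = new WeatherReport("気温は25°C、晴れ <快適>");

// Default options: non-ASCII and HTML-sensitive characters come out as \uXXXX
// escape sequences, the hard-to-read form described above.
Console.WriteLine(JsonSerializer.Serialize(result));
// {"Summary":"\u6C17\u6E29\u306F25\u00B0C\u3001..."}

// Relaxed encoder: the same payload keeps its characters intact.
var relaxed = new JsonSerializerOptions(JsonSerializerOptions.Default)
{
    Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping,
};
Console.WriteLine(JsonSerializer.Serialize(result, relaxed));
// {"Summary":"気温は25°C、晴れ <快適>"}

// Hypothetical user-defined return type standing in for the record in the report.
record WeatherReport(string Summary);
```

Note that UnsafeRelaxedJsonEscaping still escapes quotation marks, backslashes, and control characters, so the output remains valid JSON; only the aggressive HTML and non-ASCII escaping is dropped.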
Thus, IMHO, FunctionCallsProcessor.ProcessFunctionResult should use
`new JsonSerializerOptions(JsonSerializerOptions.Default) { Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping }`
for JsonSerializer.Serialize by default. Also, even if we pass JsonSerializerOptions to IKernelBuilderPlugins.AddFromType, it is not used for result serialization.
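For concreteness, a minimal sketch of the proposed default options; the `llmFriendlyOptions` name and the anonymous payload are illustrative, only the option values come from the issue:

```csharp
using System;
using System.Text.Encodings.Web;
using System.Text.Json;

// Copying JsonSerializerOptions.Default keeps every other default setting;
// only the encoder is swapped for the relaxed one.
var llmFriendlyOptions = new JsonSerializerOptions(JsonSerializerOptions.Default)
{
    Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping,
};

// Stand-in for the value returned by a kernel function.
var functionResultValue = new { City = "東京", Temperature = "25°C" };
Console.WriteLine(JsonSerializer.Serialize(functionResultValue, llmFriendlyOptions));
// {"City":"東京","Temperature":"25°C"}
```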