Parse a custom response message and use it instead. #11456
This discussion was automatically closed because the community moved to community.vercel.com/ai-sdk
Question
useChat() lets us shape the outgoing request body with the prepareSendMessage() function, but is there a similar hook for response messages? Our backend doesn't support text streaming and returns the whole response in one go, so I need to parse that custom payload into a chat message. It would be super helpful if anyone has done this.
I'm using the AI SDK on the frontend, with Ollama behind FastAPI on the backend. I wanted to try the text stream protocol, but the example appears to have been deprecated as of AI SDK v6.
Request format:
```
{
  query: "" (string)
  url: "" (string)
}
```
Response format:
```
{
  response: "" (string)
  references: [""] (string[])
}
```
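One way to bridge the gap, while waiting for an SDK-level answer, is a small adapter that maps the backend's one-shot JSON payload into an assistant-message shape the chat UI can render. This is a minimal sketch, not an AI SDK API: the names `BackendResponse`, `AssistantMessage`, and `toAssistantMessage` are all hypothetical, and the message shape (a `role` plus text `parts`) is only assumed to resemble what the UI expects.

```typescript
// Shape of the backend's one-shot JSON response (from the formats above).
interface BackendResponse {
  response: string;
  references: string[];
}

// Hypothetical UI-message shape: a role plus an array of text parts.
interface AssistantMessage {
  role: "assistant";
  parts: { type: "text"; text: string }[];
}

// Convert the non-streaming backend payload into a renderable message.
// References are appended as a trailing text part so they stay visible.
function toAssistantMessage(raw: BackendResponse): AssistantMessage {
  const parts: AssistantMessage["parts"] = [
    { type: "text", text: raw.response },
  ];
  if (raw.references.length > 0) {
    parts.push({
      type: "text",
      text: "References:\n" + raw.references.map((r) => `- ${r}`).join("\n"),
    });
  }
  return { role: "assistant", parts };
}
```

On the frontend you could call this in whatever hook or custom fetch wrapper receives the raw response, then append the resulting message to local chat state, sidestepping the streaming protocol entirely.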