Stream response gets truncated in vercel deployment #69800
-
Summary: When returning a stream from a route handler, the content loads just fine locally. But when the app is deployed to Vercel, the stream response always gets truncated. What I have found so far is that the "Transfer-Encoding" header is not returned in the response headers when deployed to Vercel.
Additional information: No response
Example: No response
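(The original poster's code sample did not come through in this thread. The following is a minimal sketch of a streaming route handler of the kind described, assuming the Next.js App Router; the file path, chunk contents, and delay are illustrative, not the reporter's actual code.)

```ts
// app/api/stream/route.ts  (hypothetical path)
export async function GET() {
  const encoder = new TextEncoder();

  // Emit a few chunks with a short delay to simulate a long-running stream.
  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      for (let i = 0; i < 10; i++) {
        controller.enqueue(encoder.encode(`chunk ${i}\n`));
        await new Promise((resolve) => setTimeout(resolve, 500));
      }
      controller.close();
    },
  });

  // Locally this streams incrementally; the report above is that on Vercel the
  // response arrives truncated and without a Transfer-Encoding header.
  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```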
Replies: 3 comments 3 replies
-
Hey Breno! Here are a few things you could try:
-
Having the same issue. I am using the default AI Chatbot project that Vercel has built: https://github.com/vercel/ai-chatbot/tree/main
-
I've been hitting the 60-second timeout on Vercel when streaming longer responses, so I built https://www.durablr.run/ to solve it. It handles the streaming on its side without time limits, and I just collect the output from Vercel as it comes in. That makes long-running tasks much more reliable.
-
In my app, that issue was caused by the 'Function Max Duration' setting in the Vercel project. The default maximum duration for serverless functions is 10 seconds, but it can be increased to 60 seconds on the free tier.
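For reference, a minimal sketch of raising that limit per route with the App Router's maxDuration segment config (the file path is illustrative; 60 seconds is the free-tier cap mentioned above):

```ts
// app/api/stream/route.ts  (hypothetical path)
// Raise this function's maximum execution time on Vercel. On the free (Hobby)
// plan this can be set as high as 60 seconds, per the comment above.
export const maxDuration = 60;
```

The same limit can also be adjusted project-wide from the Vercel dashboard setting the commenter mentions, or per function in vercel.json.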