When two users make requests to the streaming endpoint at the same time, user A's stream will contain chat engine events from user B's stream. This leaks user B's message to user A if the events contain the message as a context query:
```json
{
  "type": "events",
  "data": {
    "title": "Retrieving context for query: 'Show me articles related to machine learning.\n'"
  }
}
```
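For illustration, here is a minimal asyncio sketch of the failure mode, using hypothetical names rather than the project's actual code: a single event queue shared by every request ends up mixing events from concurrent streams.

```python
import asyncio


async def chat_engine(events: asyncio.Queue, message: str, latency: float) -> None:
    """Stand-in for the chat engine emitting a context-retrieval event."""
    await asyncio.sleep(latency)  # simulate retrieval work
    await events.put(f"Retrieving context for query: '{message}'")


async def stream_response(events: asyncio.Queue, user: str, message: str, latency: float) -> None:
    """Stand-in for the streaming endpoint: start the engine, then forward the next event."""
    asyncio.create_task(chat_engine(events, message, latency))
    event = await events.get()  # reads from a queue shared with other requests
    print(f"{user} received: {event!r}")


async def main() -> None:
    # One queue shared by every "request", standing in for an event handler
    # that is registered globally instead of per request.
    shared_events: asyncio.Queue = asyncio.Queue()
    # User A's retrieval is slower, so the first event in the shared queue
    # is user B's -- and user A's stream forwards it.
    await asyncio.gather(
        stream_response(shared_events, "user A", "Show me articles related to machine learning.", 0.2),
        stream_response(shared_events, "user B", "Summarize my private notes.", 0.0),
    )


asyncio.run(main())
```

Running this prints user B's query in user A's output, which matches the observed leak.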
A workaround is to remove the `event_handler` from `VercelStreamResponse`. Ideally these chat engine events should only be emitted to the request that triggered them, but I am not sure if that is possible.
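One possible direction, sketched below with a plain asyncio.Queue and a hypothetical run_chat_engine helper instead of the real chat engine and VercelStreamResponse, is to create the event handler inside the route so each request only streams events it emitted itself:

```python
import asyncio
import json

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()


async def run_chat_engine(message: str, events: asyncio.Queue) -> None:
    """Hypothetical stand-in for the chat engine; it only emits into the queue it was given."""
    await events.put(f"Retrieving context for query: '{message}'")
    await events.put(None)  # sentinel: no more events


@app.post("/api/chat")
async def chat(message: str) -> StreamingResponse:
    # A fresh queue (i.e. a fresh event handler) per request: events emitted
    # while answering this request can only reach this request's stream.
    events: asyncio.Queue = asyncio.Queue()

    async def event_stream():
        task = asyncio.create_task(run_chat_engine(message, events))
        while (event := await events.get()) is not None:
            yield json.dumps({"type": "events", "data": {"title": event}}) + "\n"
        await task

    return StreamingResponse(event_stream(), media_type="text/plain")
```

This is only a sketch of the per-request scoping idea, not the actual fix for the template's streaming response class.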
Hi @Tetr4, how did you run the API? Theoretically, if you start the FastAPI app with Uvicorn, each user request is handled by a single worker, so there should be no data race.
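For reference, a single Uvicorn worker still serves overlapping requests concurrently: async endpoints are interleaved on one event loop, so any module-level state (such as a globally registered event handler) is visible to every in-flight request. A minimal sketch of that interleaving:

```python
import asyncio

# Module-level state (like a globally registered event handler) exists once
# per worker process, not once per request.
seen_messages: list[str] = []


async def handle_request(message: str) -> list[str]:
    """Stand-in for an async endpoint served by a single Uvicorn worker."""
    seen_messages.append(message)
    await asyncio.sleep(0.1)  # awaiting I/O lets the event loop switch to the other request
    return list(seen_messages)


async def main() -> None:
    # Two requests arrive at the same worker at roughly the same time.
    results = await asyncio.gather(
        handle_request("user A's message"),
        handle_request("user B's message"),
    )
    print(results)  # both requests observe each other's data


asyncio.run(main())
```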