The current implementation of the `ext_proc` filter assumes all traffic goes to the `/v1/chat/completions` endpoint, and the translators return an error when a request targets a different path.

It will be common for existing applications to already consume other OpenAI API endpoints (for example, `/v1/models` to list the available models), and having requests to those endpoints fail when routed through the AI gateway makes adoption harder.

It would be great to have a mechanism to configure the `ext_proc` filter not to fail on unknown paths and instead allow the request through, so that existing applications can be proxied by the ai-gateway more easily.
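To illustrate the requested behavior, here is a minimal sketch of the path-dispatch decision, assuming a hypothetical `allowUnknownPaths` config flag and a `knownPaths` set (both are illustrative names, not the ai-gateway's actual API):

```go
package main

import "fmt"

// knownPaths lists the endpoints the translators currently handle.
// (Hypothetical helper; the real ai-gateway routing logic differs.)
var knownPaths = map[string]bool{
	"/v1/chat/completions": true,
}

// shouldPassthrough reports whether a request to path should bypass
// translation and be forwarded upstream unchanged, given a hypothetical
// allowUnknownPaths configuration flag.
func shouldPassthrough(path string, allowUnknownPaths bool) bool {
	if knownPaths[path] {
		return false // known endpoint: translate as today
	}
	// Unknown endpoint: pass through if configured, otherwise keep
	// the current behavior of rejecting the request.
	return allowUnknownPaths
}

func main() {
	fmt.Println(shouldPassthrough("/v1/chat/completions", true)) // false
	fmt.Println(shouldPassthrough("/v1/models", true))           // true
	fmt.Println(shouldPassthrough("/v1/models", false))          // false
}
```

With the flag enabled, a request to `/v1/models` would be proxied untouched instead of failing in the translator; with it disabled, today's strict behavior is preserved.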