Conversation

@evaline-ju (Collaborator) commented Dec 12, 2025

Closes: #107 (vLLM 0.11.1+ compatibility); the breaking API changes and the related vLLM PRs are linked there.

This PR uses the vLLM API server code at the time of writing (https://github.com/vllm-project/vllm/blob/e83b7e379c11bf136c1b96bc6a67b6d2207cfde4/vllm/entrypoints/openai/api_server.py) and the test mock objects from https://github.com/vllm-project/vllm/blob/8f8fda261a620234fdeea338f44093d5d8072879/tests/entrypoints/openai/test_serving_chat.py as references.

Some of the try/except ImportError statements have been simplified. They existed to preserve backwards compatibility with older vLLM versions, but 0.11.1 introduces so many breaking API changes that keeping those compatibility paths would require too many updates in the adapter.
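
For readers unfamiliar with the pattern, here is a minimal sketch of the kind of compatibility shim being removed. The import paths are illustrative assumptions for the sake of the example, not the adapter's actual imports:

```python
# Sketch only: the module paths below are assumed, not the adapter's real code.

# Before: a try/except ImportError tolerated both old and new vLLM layouts,
# so the adapter could run against multiple vLLM versions.
try:
    from vllm.entrypoints.openai.serving_chat import OpenAIServingChat
except ImportError:
    # Assumed pre-0.11.1 fallback path (illustrative only).
    from vllm.entrypoints.openai.chat import OpenAIServingChat

# After: with vLLM 0.11.1+ as the only supported version, a plain import
# suffices, and an ImportError now simply signals an unsupported vLLM:
#
#     from vllm.entrypoints.openai.serving_chat import OpenAIServingChat
```

The trade-off is standard: each fallback branch has to be tested against every supported version, so once the supported range narrows to 0.11.1+, the direct import is simpler and fails loudly on unsupported installs.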
