The legacy LLMEngine in offline inference #14822
Unanswered
RookieChenTaoYu asked this question in Q&A
Replies: 0 comments
In the v1 code, the file /vllm/vllm/v1/engine/llm_engine.py defines a class LLMEngine whose docstring reads "Legacy LLMEngine for backwards compatibility.". If this llm_engine.py is legacy, what should be used for offline inference in vLLM?
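For reference, the offline-inference code I have been running goes through the higher-level vllm.LLM wrapper rather than constructing an engine directly; a minimal sketch of that usage (the model name and sampling values are just placeholders):

```python
from vllm import LLM, SamplingParams

# Placeholder model; any model vLLM supports works here.
llm = LLM(model="facebook/opt-125m")

sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

# generate() performs offline (batch) inference over the given prompts.
outputs = llm.generate(["Hello, my name is"], sampling_params)

for output in outputs:
    print(output.outputs[0].text)
```

My question is about what this wrapper uses underneath now that the v1 LLMEngine is marked legacy.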
