
ROCm and sliding windows fixes #2033

Merged
merged 10 commits into from
Jun 10, 2024

Conversation

@fxmarty (Contributor) commented Jun 6, 2024

Fixes models requiring sliding-window attention: there is no need to raise an error at load time when max_input_tokens < sliding_window.

Also updates the vLLM fork commit to pick up the fixes from ROCm/vllm#28, and fixes a few ROCm issues.

@fxmarty fxmarty requested a review from Narsil June 6, 2024 12:40
@Narsil (Collaborator) commented Jun 6, 2024

> Fix models requiring window attention - no need to raise an error at load time. Only if context > window length at runtime.

By "context" you mean --max-total-tokens, right? (We need to crash at load time; crashing randomly at runtime is a terrible UX.)
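The resolution discussed here can be sketched as a load-time check: refuse to start only when the configured token budget actually exceeds the model's sliding window and the attention backend cannot enforce windowing. This is an illustrative sketch, not TGI's actual code; the names `max_total_tokens`, `sliding_window`, and `backend_supports_windowing` are hypothetical.

```python
def validate_context_length(max_total_tokens, sliding_window, backend_supports_windowing):
    """Crash at load time rather than randomly at runtime (hypothetical sketch).

    If the whole token budget fits inside the sliding window, the window
    never truncates attention, so even a backend without windowing support
    is safe and no error is needed.
    """
    if sliding_window is None or max_total_tokens <= sliding_window:
        return  # the window never kicks in; safe on any backend
    if not backend_supports_windowing:
        raise ValueError(
            f"max_total_tokens={max_total_tokens} exceeds "
            f"sliding_window={sliding_window}, and this attention backend "
            "does not support windowed attention."
        )
```

This keeps Narsil's requirement (a hard failure at startup) while dropping the overly strict error for the harmless max_input_tokens < sliding_window case.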

@fxmarty fxmarty changed the title ROCm fixes ROCm and sliding windows fixes Jun 7, 2024
@fxmarty fxmarty requested a review from Narsil June 7, 2024 09:20
@fxmarty (Contributor, Author) commented Jun 7, 2024

@Narsil I assume paged attention always works with a sliding window.
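For intuition only: under paged attention, a sliding window simply limits how far back each query token may attend, so only the KV-cache blocks covering the most recent `window` tokens matter. A minimal sketch of the windowed causal mask (a hypothetical helper, not vLLM's implementation):

```python
def sliding_window_mask(seq_len, window):
    """Boolean mask: query i may attend key j iff j <= i (causal)
    and i - j < window (within the sliding window)."""
    return [
        [j <= i and i - j < window for j in range(seq_len)]
        for i in range(seq_len)
    ]
```

With `window=2`, token 3 of a length-4 sequence attends only to tokens 2 and 3; block tables in a paged-attention engine can therefore drop blocks that fall entirely outside the window.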

Review comment on launcher/src/main.rs (outdated, resolved)
@fxmarty fxmarty requested a review from Narsil June 7, 2024 12:40
@Narsil (Collaborator) left a review comment


LGTM

@fxmarty fxmarty merged commit 9b3674d into main Jun 10, 2024
5 checks passed
@fxmarty fxmarty deleted the rocm-fixes branch June 10, 2024 07:09
@Narsil Narsil mentioned this pull request Jun 24, 2024
5 tasks
yuanwu2017 pushed a commit to yuanwu2017/tgi-gaudi that referenced this pull request Sep 26, 2024
* update vllm commit & fix models using sliding window

* update

* update commit

* fix bug where TunableOp is bound to CUDA graphs even when CUDA graphs are disabled

* enable tunableop by default

* fix sliding window

* address review

* dead code

* precise comment

* is it flaky?