
[Bug]: FP8 accuracy issue #454

Open
1 task done
yuzho-amd opened this issue Feb 28, 2025 · 0 comments
Labels
bug Something isn't working

Comments

@yuzho-amd

Your current environment

vLLM branch: llama_fp8_12062024, commit: d7fefdf
You can use this Docker image:
yuzho/private:ali1209_llama_fp8_12062024_d7fefdf03

🐛 Describe the bug

The output token IDs of FP8 models are not the same when using different TP sizes:

(screenshot: differing output token IDs across TP sizes)

This is the script to test models quantized by Quark:

from vllm import LLM, SamplingParams

if __name__ == '__main__':
    llm = LLM(
        "/mnt/raid0/pretrained_model/amd/Llama-2-7b-hf-FP8-KV",
        tensor_parallel_size=4,
        quantization="fp8",
        kv_cache_dtype="fp8",
    )
    sampling_params = SamplingParams(temperature=0, top_k=1)
    print(llm.generate("What is batch inference?", sampling_params))

And this is the script to test models quantized by vLLM:

from vllm import LLM, SamplingParams

if __name__ == '__main__':
    llm = LLM(
        "/mnt/raid0/pretrained_model/meta-llama/Llama-2-7b-hf",
        tensor_parallel_size=4,
        quantization="fp8",
        kv_cache_dtype="fp8",
    )
    sampling_params = SamplingParams(temperature=0, top_k=1)
    print(llm.generate("What is batch inference?", sampling_params))
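When comparing greedy (temperature=0, top_k=1) outputs across TP sizes, it helps to report the first position where the two token-ID sequences diverge rather than just noting they differ. A minimal sketch (not part of the original report; the helper name first_divergence and the sample token IDs are hypothetical):

def first_divergence(ids_a, ids_b):
    # Return the index of the first position where the two token-ID
    # sequences differ, or None if they match over the shared prefix.
    for i, (a, b) in enumerate(zip(ids_a, ids_b)):
        if a != b:
            return i
    return None

# Hypothetical token IDs from a TP=1 run vs a TP=4 run
tp1_ids = [1, 4593, 338, 9853, 27262, 29973]
tp4_ids = [1, 4593, 338, 9853, 27262, 29889]
print(first_divergence(tp1_ids, tp4_ids))

In practice you would fill tp1_ids and tp4_ids from output[0].outputs[0].token_ids of the two llm.generate() calls; a late divergence point suggests accumulated numeric drift from TP reduction order rather than an outright quantization bug.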

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
@yuzho-amd yuzho-amd added the bug Something isn't working label Feb 28, 2025