WARN llama-server <chat> exited with status code 1 #3512
Comments
Same here. Also tried 0.20.
Same. Version 0.23.
It's likely the host CPU doesn't support AVX2, which causes the issue. We should at least provide a proper error log for this case. Filing #3694
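For anyone hitting this, a quick way to check the CPU flags mentioned above on Linux is to look at `/proc/cpuinfo`. This is a minimal sketch, not part of Tabby itself; it only covers Linux, where the kernel exposes a `flags` line per core:

```shell
# Check whether the host CPU advertises AVX2 (Linux only).
# If /proc/cpuinfo is missing (e.g. macOS), errors are suppressed and the
# check falls through to "not supported".
if grep -q avx2 /proc/cpuinfo 2>/dev/null; then
    echo "AVX2 supported"
else
    echo "AVX2 not supported"
fi
```

On machines where this prints "AVX2 not supported", a binary built with AVX2 enabled can crash or exit immediately, which matches the `exited with status code 1` symptom here.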
My system has an Intel® Core™ i7-4700HQ, which Intel says does have the AVX2 feature. Note that this error only appears for me if I try to use the
Hi @JamesNewton, you might already be aware, but just to ensure clarity, running Tabby with Vulkan requires a Vulkan setup on your system. Can you confirm that this is in place? If so, could you please provide more details about your system, such as the GPU, OS, and other relevant specifications?
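One way to confirm the Vulkan setup is with the `vulkaninfo` tool from the `vulkan-tools` package. This is a hedged sketch, and the `--summary` flag assumes a reasonably recent `vulkan-tools`; the tool exits non-zero when no Vulkan driver (ICD) is found:

```shell
# Check for a working Vulkan setup before running Tabby with the Vulkan backend.
# vulkaninfo exits non-zero if no Vulkan driver (ICD) can be loaded.
if command -v vulkaninfo >/dev/null 2>&1 && vulkaninfo --summary >/dev/null 2>&1; then
    echo "Vulkan setup looks OK"
else
    echo "Vulkan not available"
fi
```

If this prints "Vulkan not available", install `vulkan-tools` plus a GPU driver with Vulkan support and re-run the check before trying Tabby's Vulkan device again.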
Describe the bug
I am trying to run Tabby, but I get:
Information about your version
0.21
Ideally, when this output appears, Tabby should exit instead of continuing to try to start. It should also print the command to retry with the same parameters, or at least write a log file, so the failure can be investigated; copying and pasting the console output doesn't work because it is escaped.