Does whisper.cpp support GPU selection on a multi-GPU setup? #2861
Unanswered
MislavJuric asked this question in Q&A
Replies: 1 comment
-
You can use:

    # use GPU 0
    CUDA_VISIBLE_DEVICES=0 whisper-cli ...

    # use GPU 1
    CUDA_VISIBLE_DEVICES=1 whisper-cli ...
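For convenience, the CUDA_VISIBLE_DEVICES approach above can be wrapped in a small shell helper. This is only a sketch: run_on_gpu is a made-up name (not part of whisper.cpp), and the whisper-cli arguments in the comments are placeholders. The idea is that restricting which devices CUDA exposes makes the chosen GPU appear as device 0 inside the process.

```shell
# Hypothetical helper: pin a command to one GPU index by limiting
# which devices CUDA exposes to the process. The selected GPU then
# appears as device 0 from the process's point of view.
run_on_gpu() {
  gpu_index="$1"
  shift
  CUDA_VISIBLE_DEVICES="$gpu_index" "$@"
}

# Usage (model and audio paths are placeholders):
# run_on_gpu 0 whisper-cli -m models/ggml-base.en.bin -f samples/jfk.wav
# run_on_gpu 1 whisper-cli -m models/ggml-base.en.bin -f samples/jfk.wav
```

Note that CUDA_VISIBLE_DEVICES only controls which devices the process can see; it does not split work across GPUs.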
-
Hello whisper.cpp developers,
I wanted to know whether whisper.cpp supports GPU selection on a multi-GPU setup.
What I mean is: if I have a multi-GPU system (let's say 2 GPUs), can I tell whisper.cpp, via some parameter, to load the model on the first or on the second GPU? I know that llama.cpp has the parameters
--device cuda_0 --split-mode none
(--split-mode none
is specific to llama.cpp), so I'm wondering if there are equivalent command-line arguments in whisper.cpp. If there are none, is there any other way I can pick which GPU whisper.cpp runs on?
Thank you in advance!