
Support for IPEX/Intel GPUs #718

Open
hanthor opened this issue Feb 3, 2025 · 4 comments

Comments


hanthor commented Feb 3, 2025

The proper bits for supporting Intel GPUs for running ollama and others exist here: https://github.com/intel/ipex-llm/tree/main/docker/llm

We just need someone with the expertise to add the right pieces to support it.

@ericcurtin
Collaborator

@cgruver had joy with this container image, not sure if it's the same type of acceleration:

https://github.com/containers/ramalama/blob/main/container-images/intel-gpu/Containerfile

We would need someone from the community with the hardware and expertise to test it and open a PR.

@cgruver
Collaborator

cgruver commented Feb 3, 2025

@hanthor

ramalama --gpu --image quay.io/cgruver0/llama-cpp-intel-gpu:latest run granite

@cgruver
Collaborator

cgruver commented Feb 3, 2025

@hanthor I've tested this on an Intel Core Ultra 155H with its Arc GPU. It's actually pretty snappy.

Also, JFYI: that particular image uses OpenCL. I have a refactor on the way that will use Level Zero, which is a bit faster than OpenCL.
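To see which of the two backends actually exposes the GPU on a given machine, here is a minimal sketch (an assumption, not something from this thread) using oneAPI's `sycl-ls` tool, which lists SYCL-visible devices with a `level_zero:` or `opencl:` prefix:

```shell
# Probe which SYCL backends see the Intel GPU. Devices listed with a
# "level_zero:" prefix go through Level Zero; "opencl:" entries go
# through OpenCL. sycl-ls ships with the Intel oneAPI Base Toolkit and
# is not part of ramalama itself, hence the fallback message.
if command -v sycl-ls >/dev/null 2>&1; then
  sycl-ls
else
  echo "sycl-ls not found; install the Intel oneAPI Base Toolkit to probe devices"
fi
```

Running this inside the container image is the most reliable check, since the host may not have the oneAPI runtime installed at all.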

@hanthor
Author

hanthor commented Feb 5, 2025

> @cgruver had joy with this container image, not sure if it's the same type of acceleration:
>
> https://github.com/containers/ramalama/blob/main/container-images/intel-gpu/Containerfile
>
> We would need someone from the community with the hardware and expertise to test it and open a PR.

This worked great! Now I'm wondering about these NPUs... maybe running Whisper on one? I still haven't come up with a decent application for them.
