[GH-ISSUE #3066] CLBlast for integrated GPU support #1888

Closed
opened 2026-04-12 11:58:50 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @joshuachris2001 on GitHub (Mar 11, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3066

Is there support for compiling ollama with CLBlast for a device with an integrated non-AMD GPU?
I've tried compiling with: `CLBlast_DIR=/usr/lib/cmake/CLBlast go generate -tags clbast ./...`
yet I still get "no GPU detected".

The iGPU I'm trying to get CLBlast working on is an `Intel HD Graphics 5500`. When llama is explicitly compiled for it, the slight speed boost is still helpful, especially with CLIP.
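As a first diagnostic for "no GPU detected", it can help to confirm the iGPU is visible to the OpenCL runtime at all before rebuilding. A minimal sketch, assuming a Linux system where `clinfo` is (or can be) installed from the distro's package manager; if no Intel platform shows up here, a CLBlast build has no device to find:

```shell
# Check whether any OpenCL platform/device (e.g. the Intel iGPU) is visible.
# If clinfo lists nothing, the OpenCL ICD/driver (e.g. intel-opencl-icd or
# beignet for older chips like HD Graphics 5500) is likely missing.
if command -v clinfo >/dev/null 2>&1; then
  clinfo -l   # compact list of OpenCL platforms and their devices
else
  echo "clinfo not installed; install it (e.g. 'apt install clinfo') to inspect OpenCL devices"
fi
```

The driver package names above are assumptions that vary by distro and GPU generation; the point is only that `clinfo` must list the device before any CLBlast-enabled build can use it.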

GiteaMirror added the gpu label 2026-04-12 11:58:50 -05:00
Author
Owner

@jmorganca commented on GitHub (Mar 11, 2024):

Closing for https://github.com/ollama/ollama/issues/2637 - thanks for the issue!


Reference: github-starred/ollama#1888