[GH-ISSUE #7861] Support AMD 780m #67084

Open
opened 2026-05-04 09:25:32 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @fce2 on GitHub (Nov 27, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7861

Please consider adding better AMD support (e.g. the 7840U with the 780M iGPU).

GiteaMirror added the feature request label 2026-05-04 09:25:32 -05:00
Author
Owner

@winstonma commented on GitHub (Dec 3, 2024):

Please check https://github.com/ollama/ollama/pull/6282

Author
Owner

@fce2 commented on GitHub (Dec 3, 2024):

Thanks... however...

I tried ollama 0.4.7 (on Win11) with:

```
HSA_OVERRIDE_GFX_VERSION=11.0.1
OLLAMA_MAX_LOADED_MODELS=1
OLLAMA_NUM_PARALLEL=1
OLLAMA_KEEP_ALIVE=3600
```

but it still runs 100% on the CPU; that's what both Task Manager and `ollama ps` report.

It looks like there is an AMD-specific build available: https://github.com/likelovewant/ollama-for-amd/releases
I'll try that one.

edit: also not working here, still CPU.
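For readers trying to reproduce this on Linux/WSL: the overrides above are environment variables that must be visible to the ollama server process, so they need to be set in the same session that launches it (on Windows cmd the equivalent is `set VAR=value`, in PowerShell `$env:VAR = "value"`). A minimal sketch, using the exact values from the comment above; note that `HSA_OVERRIDE_GFX_VERSION` is an unofficial ROCm workaround that makes an unsupported chip (the 780M is reportedly gfx1103) masquerade as a supported gfx11 target, and whether `11.0.1` is the right value for a given driver stack is not guaranteed:

```shell
# Session-scoped overrides, values copied from the comment above.
# HSA_OVERRIDE_GFX_VERSION is a ROCm compatibility workaround, not an
# officially supported setting for every driver version.
export HSA_OVERRIDE_GFX_VERSION=11.0.1
export OLLAMA_MAX_LOADED_MODELS=1
export OLLAMA_NUM_PARALLEL=1
export OLLAMA_KEEP_ALIVE=3600

# Launch the server in this same shell so it inherits the overrides:
#   ollama serve
# Sanity-check what is actually set:
env | grep -E '^(HSA_OVERRIDE|OLLAMA_)' | sort
```

If the variables are set in a different window (or only in the user profile after the server service already started), the running server never sees them, which can look exactly like "still 100% CPU".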

Author
Owner

@winstonma commented on GitHub (Dec 4, 2024):

Currently ollama doesn't work with iGPUs; you have to patch ollama and compile it from source. Once the compilation is done, copy the built ollama binary into the installation directory, and everything should work fine.

Also, I use ollama on Linux, so I'm not sure how the Win11 build behaves.


Reference: github-starred/ollama#67084