[GH-ISSUE #2972] How to run Windows version Ollama on AMD GPU? #63864

Closed
opened 2026-05-03 15:16:34 -05:00 by GiteaMirror · 2 comments
Originally created by @TM119 on GitHub (Mar 7, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2972

I have a W6800; apparently the Windows version of Ollama runs models on the CPU rather than the GPU. Will AMD GPUs be supported?

@pdevine commented on GitHub (Mar 7, 2024):

Closing as a dupe of #2598

@ravarcade commented on GitHub (Mar 7, 2024):

@TM119 You can try a test build from dhiltgen:
https://github.com/dhiltgen/ollama/releases
I just tested it on my 7800XT.
Don't forget to install ROCm 5.7.

Reference: github-starred/ollama#63864