[GH-ISSUE #7016] Ollama not utilizing Radeon RX 570 GPU - Possible incompatibility? #66503

Closed
opened 2026-05-04 07:02:52 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @X4n71uM on GitHub (Sep 28, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7016

What is the issue?

Hi Ollama team and community,

I'm running Ollama (v0.3.12) on Windows 10 and I'm unable to get it to utilize my Radeon RX 570 OC 8 GB GPU (with the latest Adrenalin driver). Ollama falls back to the CPU, which is significantly slower for my use case.

Here's the relevant part of the log:
"""
time=2024-09-28T02:43:51.626+02:00 level=INFO source=gpu.go:199 msg="looking for compatible GPUs"
time=2024-09-28T02:43:51.744+02:00 level=INFO source=gpu.go:347 msg="no compatible GPUs were discovered"
"""

Is there any way to force GPU usage, perhaps through a command-line flag or configuration setting? Is my RX 570 simply too old or unsupported by Ollama's current ROCm implementation?

I'd appreciate any guidance on how to enable GPU usage or confirmation if my GPU is indeed incompatible. Even if my GPU isn't ideal for large models, any performance boost would be welcome.
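For reference, a workaround some Polaris owners have experimented with is overriding the ROCm GFX target via the `HSA_OVERRIDE_GFX_VERSION` environment variable (the RX 570 is gfx803). Whether it helps depends on whether the bundled ROCm runtime still ships gfx803 kernels, so treat this as an unsupported experiment rather than a fix:

```shell
# Unofficial experiment: tell the ROCm/HSA runtime to treat the Polaris
# card as gfx803. Official ROCm builds have dropped Polaris support, so
# this may simply fail or crash rather than enable the GPU.
HSA_OVERRIDE_GFX_VERSION=8.0.3 ollama serve
```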

Thanks in advance!

OS

Windows

GPU

AMD

CPU

AMD

Ollama version

v0.3.12

GiteaMirror added the bug label 2026-05-04 07:02:53 -05:00

@dajamox commented on GitHub (Sep 28, 2024):

ROCm isn't really supported on Polaris, you might want to look out for #5059 since that adds Vulkan, which should work.


@dhiltgen commented on GitHub (Sep 28, 2024):

Older Radeon support is being tracked via #2453


@juwonpee commented on GitHub (Nov 19, 2024):

Wouldn't Vulkan still be supported, though?


Reference: github-starred/ollama#66503