[GH-ISSUE #6607] docker image for rocm-3.5.1 to run on older AMD gpus #50674

Closed
opened 2026-04-28 16:46:27 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @drhboss on GitHub (Sep 3, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6607

Originally assigned to: @dhiltgen on GitHub.

Any chance an image with rocm-3.5.1 can be prepared for older GPUs, e.g. the RX 580?

GiteaMirror added the feature request label 2026-04-28 16:46:27 -05:00
Author
Owner

@dhiltgen commented on GitHub (Sep 3, 2024):

I tried a couple of different old ROCm base images from Docker Hub, and it doesn't look like this is straightforward. The header layout has changed quite a bit, and llama.cpp assumes the newer layout. Even with some symlinks to get the headers found, things won't compile due to missing definitions.

We're tracking support for these older GPUs via #2453
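For illustration, the symlink attempt described above amounts to remapping the old per-component header layout onto the unified include tree newer builds expect. This is a hypothetical sketch, not the actual commands used: the directory names are assumptions based on how ROCm 3.5.x shipped HIP headers, and a throwaway local directory stands in for /opt/rocm so the remapping can be shown without a real ROCm install.

```shell
#!/bin/sh
# Hypothetical sketch of the header-layout workaround mentioned above.
# A throwaway local directory stands in for /opt/rocm; the per-component
# hip/include path is an assumption about the ROCm 3.5.x image layout.
set -eu
ROCM_ROOT="${ROCM_ROOT:-./rocm-3.5-demo}"   # stand-in for /opt/rocm

# Recreate the old per-component layout: HIP headers under hip/include/hip.
mkdir -p "$ROCM_ROOT/hip/include/hip"
touch "$ROCM_ROOT/hip/include/hip/hip_runtime.h"

# Newer llama.cpp builds include <hip/hip_runtime.h> relative to a unified
# $ROCM_ROOT/include tree, so symlink the old location into that tree.
mkdir -p "$ROCM_ROOT/include"
ln -sfn "$(cd "$ROCM_ROOT/hip/include/hip" && pwd)" "$ROCM_ROOT/include/hip"

ls "$ROCM_ROOT/include/hip"
```

Even with the headers resolvable this way, the build still fails as described: symbols and definitions added after ROCm 3.5 are simply absent from the old headers, so no amount of path remapping helps.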


Reference: github-starred/ollama#50674