[GH-ISSUE #2694] Add another binary that the linux install script could use on ROCm accelerated systems. #1608

Closed
opened 2026-04-12 11:32:13 -05:00 by GiteaMirror · 3 comments

Originally created by @TimTheBig on GitHub (Feb 22, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2694

Originally assigned to: @dhiltgen on GitHub.

Another binary that the install script could use on ROCm-accelerated systems would be useful. Releases are not compiled with HIP, so non-NVIDIA GPU acceleration is not available. https://github.com/ollama/ollama/issues/2685#issuecomment-1959937668


@sid-cypher commented on GitHub (Feb 23, 2024):

Erm, the end of my comment was a question, not a statement. I personally feel it would be disrespectful toward the esteemed experts and maintainers to swamp them with newly opened issues based on unverified assumptions; we should do some of the legwork first.

Release v0.1.27 seems to work with AMD ROCm out of the box, and the script just installs a release.
So the issue boils down to which version the download URL https://ollama.com/download/ollama-linux-$ARCH currently points to.
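For context, the `$ARCH` suffix in that URL would typically be derived from `uname -m`. A minimal sketch of that mapping, assuming the release names use the `amd64`/`arm64` suffixes (the function name and mapping here are illustrative, not the actual install script):

```shell
#!/bin/sh
# Map a machine type (as reported by `uname -m`) to a release arch suffix.
arch_suffix() {
  case "$1" in
    x86_64) echo "amd64" ;;
    aarch64|arm64) echo "arm64" ;;
    *) echo "unsupported architecture: $1" >&2; return 1 ;;
  esac
}

# Example: build the download URL for an x86_64 machine.
echo "https://ollama.com/download/ollama-linux-$(arch_suffix x86_64)"
```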


@TimTheBig commented on GitHub (Feb 23, 2024):

I see the download URL needs/needed to be updated.


@dhiltgen commented on GitHub (Mar 12, 2024):

This should be addressed in [0.1.29](https://github.com/ollama/ollama/releases/tag/v0.1.29).

We'll detect a Radeon system, check for ROCm v6, and if not found, download a minimal dependency set of ROCm for Ollama to run.
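The detection flow described above could be sketched roughly as follows. Note this is a hypothetical illustration: the AMD PCI vendor ID (`1002`) and the `.info/version` file under a ROCm install prefix are real conventions, but the function names, the exact checks, and the bundle URL in the comment are assumptions, not the actual install script.

```shell
#!/bin/sh
# Detect an AMD Radeon GPU via the PCI vendor ID 1002.
has_radeon() {
  lspci -d '1002:' 2>/dev/null | grep -qi 'VGA\|Display'
}

# Check whether a ROCm v6 install is present, given the path to its
# version file (e.g. /opt/rocm/.info/version).
have_rocm_v6() {
  [ -e "$1" ] && grep -q '^6\.' "$1"
}

if has_radeon; then
  if have_rocm_v6 /opt/rocm/.info/version; then
    echo "ROCm v6 found; using the system install."
  else
    echo "Downloading a minimal ROCm dependency set for Ollama..."
    # e.g. curl -fsSL https://ollama.com/download/ollama-linux-amd64-rocm.tgz | tar xz ...
  fi
fi
```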


Reference: github-starred/ollama#1608