[PR #13000] [CLOSED] Add ROCm 7 support #14037

opened 2026-04-13 00:43:07 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/13000
Author: @phueper
Created: 11/7/2025
Status: Closed

Base: `main` ← Head: `ollama_main_rocm7`


📝 Commits (8)

- [`4af2080`](https://github.com/ollama/ollama/commit/4af20809f033db8615069eacaf870bcceaf3056d) add ROCm 7 preset
- [`ef64a0a`](https://github.com/ollama/ollama/commit/ef64a0a10e16494ead8dac69d6278cebc033995f) include rocm dependencies in install
- [`a37c08d`](https://github.com/ollama/ollama/commit/a37c08df4ea020eeb4a656847aac0a3d528d1422) add a rocm_v7 build to the rocm FLAVOR
- [`c9623fe`](https://github.com/ollama/ollama/commit/c9623fed99525bdd038af84a87c4a8bef85c4dd6) update runtime to Ubuntu 25.10
- [`e7e7889`](https://github.com/ollama/ollama/commit/e7e7889d4fa32458ee2630f5b3897b7c8aa8973f) clean Dockefile
- [`b240a36`](https://github.com/ollama/ollama/commit/b240a36e6cbc0440f99f4f96275b0c6054cff809) update to ROCm 7.1.1
- [`ce556ad`](https://github.com/ollama/ollama/commit/ce556addbbc820f2f707cd690e8668896826801b) sync with upstream,
- [`7860c17`](https://github.com/ollama/ollama/commit/7860c17947e28684a6f05f5f3f9c87cf515549be) update to ROCm 7.2

📊 Changes

3 files changed (+50 additions, -7 deletions)


📝 CMakeLists.txt (+1 -1)
📝 CMakePresets.json (+15 -1)
📝 Dockerfile (+34 -5)
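Most of the diff lands in the Dockerfile, which gains a ROCm 7 build stage alongside the existing ROCm 6 one. A minimal sketch of how such a stage might look, modeled on the multi-stage pattern ollama's Dockerfile already uses; the base image tag, stage name, copied paths, and preset name here are illustrative assumptions, not the PR's actual diff:

```dockerfile
# Hypothetical ROCm 7 build stage (image tag, stage name, and paths
# are assumptions for illustration, not the real diff).
FROM rocm/dev-ubuntu-24.04:7.0 AS rocm-7
WORKDIR /src
COPY CMakeLists.txt CMakePresets.json ./
COPY ml/ ml/
RUN cmake --preset 'ROCm 7' \
    && cmake --build --parallel --preset 'ROCm 7' \
    && cmake --install build --component HIP --strip
```

The v6 and v7 stages can then be copied into the final image side by side, so the runtime picks whichever library set matches the installed driver.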

📄 Description

fixes #12734

Adds a rocm_v7 build in addition to rocm_v6, similar to the cuda-xx and jetpack-xx builds.
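Following the cuda-xx pattern, the new build is a versioned variant of the existing ROCm preset in CMakePresets.json (+15 lines per the diff summary). A hedged sketch of what such a configure/build preset pair might look like; the preset names, inherited base, and gfx target list are assumptions for illustration, not the PR's actual contents:

```json
{
  "configurePresets": [
    {
      "name": "ROCm 7",
      "inherits": [ "ROCm" ],
      "cacheVariables": {
        "CMAKE_HIP_ARCHITECTURES": "gfx1151;gfx1152"
      }
    }
  ],
  "buildPresets": [
    {
      "name": "ROCm 7",
      "configurePreset": "ROCm 7",
      "targets": [ "ggml-hip" ]
    }
  ]
}
```

With a preset in place, the variant is built with `cmake --preset 'ROCm 7' && cmake --build --preset 'ROCm 7'`, keeping the v6 and v7 toolchains cleanly separated.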

We created and tested this in the https://github.com/rjmalagon/ollama-linux-amd-apu "non-fork". I can confirm it works on my gfx1151-based AMD Ryzen AI MAX+ 395 w/ Radeon 8060S and on a gfx1152-based AMD Ryzen AI 5 340 w/ Radeon 840M. Fallback to Vulkan also works, and I would expect fallback from rocm_v7 to rocm_v6 to work as well when hardware is only supported by rocm_v6.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-13 00:43:07 -05:00
Reference: github-starred/ollama#14037