[PR #13000] Add ROCm 7 support #12774

Open
opened 2025-11-12 17:06:31 -06:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/13000
Author: @phueper
Created: 11/7/2025
Status: 🔄 Open

Base: `main` ← Head: `ollama_main_rocm7`


📝 Commits (5)

  • 6fec18c add ROCm 7 preset
  • 5a4c2cc include rocm dependencies in install
  • 1671796 add a rocm_v7 build to the rocm FLAVOR
  • 4061bd1 update runtime to Ubuntu 25.10
  • 3cc1595 clean Dockefile

📊 Changes

3 files changed (+49 additions, -6 deletions)

View changed files

📝 CMakeLists.txt (+1 -1)
📝 CMakePresets.json (+15 -1)
📝 Dockerfile (+33 -4)

📄 Description

fixes #12734

Add a rocm_v7 build in addition to rocm_v6, similar to the existing cuda-xx and jetpack-xx builds.

We created and tested this in the https://github.com/rjmalagon/ollama-linux-amd-apu "non-fork". I can confirm it works on my gfx1151-based AMD Ryzen AI MAX+ 395 w/ Radeon 8060S and on a gfx1152-based AMD Ryzen AI 5 340 w/ Radeon 840M. Fallback to Vulkan also works, and I would expect a fallback from rocm_v7 to rocm_v6 to work as well on hardware that is only supported by rocm_v6.
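For illustration only, a rocm_v7 preset added to CMakePresets.json might look like the sketch below. The preset name, the inherited base preset, and the GPU target list here are assumptions for this example, not the exact contents of this PR's CMakePresets.json change:

```json
{
  "name": "ROCm 7",
  "inherits": ["ROCm"],
  "cacheVariables": {
    "CMAKE_HIP_ARCHITECTURES": "gfx1151;gfx1152"
  }
}
```

This mirrors how separate versioned presets (e.g. the cuda-xx builds) let the build produce per-toolkit runner libraries, so the runtime can pick rocm_v7 where supported and fall back to rocm_v6 otherwise.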


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the
pull-request
label 2025-11-12 17:06:31 -06:00
Reference: github-starred/ollama-ollama#12774