[PR #6201] feat: add support for running ollama on rocm in wsl #10789

Open
opened 2025-11-12 15:36:41 -06:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/6201
Author: @evshiron
Created: 8/6/2024
Status: 🔄 Open

Base: main ← Head: rocm-wsl-support


📝 Commits (3)

  • 36b63ef feat: add support for running ollama on rocm in wsl
  • 9e75f17 refactor: clean up gpu/amd_hip_*.go
  • b5bea09 Merge branch 'main' into rocm-wsl-support

📊 Changes

4 files changed (+320 additions, -23 deletions)

View changed files

gpu/amd_hip_common.go (+23 -0)
gpu/amd_hip_linux.go (+159 -0)
📝 gpu/amd_hip_windows.go (+9 -22)
📝 gpu/amd_linux.go (+129 -1)

📄 Description

Allow running Ollama on ROCm in WSL by calling HIP functions instead of querying sysfs.

The amd_hip_linux.go was duplicated from amd_hip_windows.go; windows.LoadLibrary and syscall.SyscallN were replaced with CGO and dlfcn.h, so that the binary does not depend on the HIP runtime directly.

Finally, I added an alternative routine for RocmGPUInfo: if the existing method cannot find any AMD GPUs, the new method is tried.

Please note that the code changes have not been tested outside of WSL.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2025-11-12 15:36:41 -06:00

Reference: github-starred/ollama-ollama#10789