[PR #14543] [CLOSED] ml/backend/ggml: use HIP for VRAM query #77013

Closed
opened 2026-05-05 09:44:45 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/14543
Author: @fcui-amd
Created: 3/2/2026
Status: Closed

Base: main ← Head: vram_size_wsl


📝 Commits (1)

  • 7f348b1 ml/backend/ggml: use HIP for VRAM query

📊 Changes

3 files changed (+375 additions, -101 deletions)

View changed files

➕ llama/patches/0035-ml-backend-ggml-use-HIP-for-VRAM-query.patch (+265 -0)
📝 ml/backend/ggml/ggml/src/ggml-cuda/ggml-cuda.cu (+5 -0)
📝 ml/backend/ggml/ggml/src/mem_hip.cpp (+105 -101)

📄 Description

WSL does not expose amdgpu sysfs entries, so VRAM size cannot be read from sysfs.
Switch device memory reporting to HIP runtime APIs to get per-device total/free memory directly.
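The approach described above can be sketched roughly as follows: enumerate HIP devices and ask the runtime for free/total memory per device, rather than parsing amdgpu sysfs files. This is an illustrative sketch, not the actual patch contents; it assumes a working HIP runtime and uses the standard `hipGetDeviceCount`/`hipSetDevice`/`hipMemGetInfo` APIs.

```cpp
// Sketch: query per-device VRAM via the HIP runtime instead of reading
// amdgpu sysfs entries (which WSL does not expose). Illustrative only;
// requires the HIP runtime and at least one AMD GPU to run.
#include <hip/hip_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    if (hipGetDeviceCount(&count) != hipSuccess || count == 0) {
        fprintf(stderr, "no HIP devices found\n");
        return 1;
    }
    for (int i = 0; i < count; i++) {
        if (hipSetDevice(i) != hipSuccess) {
            continue;
        }
        size_t free_mem = 0, total_mem = 0;
        // hipMemGetInfo reports free and total memory for the current
        // device directly, with no dependency on sysfs.
        if (hipMemGetInfo(&free_mem, &total_mem) == hipSuccess) {
            printf("device %d: free=%zu total=%zu\n", i, free_mem, total_mem);
        }
    }
    return 0;
}
```

Querying the runtime rather than sysfs also keeps the reporting path identical across bare-metal Linux and WSL, which is presumably why the patch takes this route.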


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-05-05 09:44:45 -05:00

Reference: github-starred/ollama#77013