[PR #12517] [MERGED] implement nvml for linux #13854

Closed
opened 2026-04-13 00:38:39 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/12517
Author: @dhiltgen
Created: 10/6/2025
Status: Merged
Merged: 10/10/2025
Merged by: @dhiltgen

Base: main ← Head: wsl_nvml


📝 Commits (2)

  • 3301450 implement nvml for linux
  • ea7abf8 Improve scheduler logging when VRAM doesn't recover

📊 Changes

3 files changed (+125 additions, -42 deletions)

View changed files

📝 llama/patches/0026-GPU-discovery-enhancements.patch (+55 -18)
📝 ml/backend/ggml/ggml/src/mem_nvml.cpp (+44 -7)
📝 server/sched.go (+26 -17)

📄 Description

Verified on WSL2 and plain Linux systems.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-13 00:38:39 -05:00

Reference: github-starred/ollama#13854