[PR #2243] [MERGED] Harden for zero detected GPUs #57526

Closed
opened 2026-04-29 12:10:36 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/2243
Author: @dhiltgen
Created: 1/28/2024
Status: Merged
Merged: 1/28/2024
Merged by: @dhiltgen

Base: main ← Head: harden_zero_gpus


📝 Commits (1)

  • f07f8b7 Harden for zero detected GPUs

📊 Changes

1 file changed (+2 additions, -2 deletions)

View changed files

📝 gpu/gpu.go (+2 -2)

📄 Description

At least with the ROCm libraries, it's possible for the library to be present with zero GPUs detected. This fix avoids a divide-by-zero bug in llm.go when we try to calculate GPU memory with zero GPUs.
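The failure mode described above can be sketched as follows. This is a minimal illustration, not the actual patch: `GpuInfo` and `perDeviceMemory` are hypothetical stand-ins for the real structures in `gpu/gpu.go` and the memory calculation in `llm.go`, showing the kind of zero-count guard the fix adds.

```go
package main

import "fmt"

// GpuInfo is a simplified, hypothetical stand-in for the GPU info
// struct in gpu/gpu.go.
type GpuInfo struct {
	DeviceCount int
	FreeMemory  uint64 // total free VRAM across all devices, in bytes
}

// perDeviceMemory returns the average free memory per GPU. Without the
// DeviceCount check, a ROCm library present with zero attached GPUs
// would cause an integer divide-by-zero panic.
func perDeviceMemory(info GpuInfo) uint64 {
	if info.DeviceCount == 0 {
		// Zero GPUs detected: report zero memory instead of panicking.
		return 0
	}
	return info.FreeMemory / uint64(info.DeviceCount)
}

func main() {
	fmt.Println(perDeviceMemory(GpuInfo{DeviceCount: 0, FreeMemory: 0}))
	fmt.Println(perDeviceMemory(GpuInfo{DeviceCount: 2, FreeMemory: 16 << 30}))
}
```

With zero devices the function now degrades to reporting no usable GPU memory, which lets the caller fall back to CPU inference instead of crashing.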


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-29 12:10:36 -05:00

Reference: github-starred/ollama#57526