[PR #1844] [CLOSED] Workaround memory limitations #41950

Closed
opened 2026-04-24 21:45:22 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/1844
Author: @dhiltgen
Created: 1/7/2024
Status: Closed

Base: main ← Head: cuda_memory


📝 Commits (1)

  • 7cf53fc Workaround memory limitations

📊 Changes

1 file changed (+2 additions, -2 deletions)


📝 gpu/gpu.go (+2 -2)

📄 Description

This isn't a proper fix, but until we calculate memory requirements more completely, it seems to avoid crashes when approaching the limit on CUDA GPUs with smaller amounts of memory.
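The two-line change to gpu/gpu.go isn't shown in this mirror, but the idea described above can be sketched as follows. This is an illustrative assumption, not the actual diff: the function name, field names, and the size of the reserve are all hypothetical.

```go
package main

import "fmt"

// Hypothetical headroom reserved out of the free VRAM a CUDA GPU reports.
// The 384 MiB figure is an illustrative guess, not the value from the PR;
// the point is that some fixed reserve keeps the scheduler from loading
// layers right up to the limit and crashing on smaller-memory GPUs.
const cudaOverheadBytes uint64 = 384 * 1024 * 1024

// usableMemory returns how much of the reported free VRAM we allow model
// layers to occupy, clamping to zero on GPUs smaller than the reserve.
func usableMemory(freeBytes uint64) uint64 {
	if freeBytes <= cudaOverheadBytes {
		return 0
	}
	return freeBytes - cudaOverheadBytes
}

func main() {
	// A card reporting ~3.5 GiB free is offered a bit less than that.
	fmt.Println(usableMemory(3584 * 1024 * 1024))
	// A card with less free memory than the reserve gets nothing offloaded.
	fmt.Println(usableMemory(256 * 1024 * 1024))
}
```

A workaround like this trades a little capacity for stability until the real memory requirements of a model load can be computed precisely.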


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-24 21:45:22 -05:00

Reference: github-starred/ollama#41950