[PR #10694] [MERGED] Re-remove cuda v11 #60025

Closed
opened 2026-04-29 14:56:45 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/10694
Author: @dhiltgen
Created: 5/13/2025
Status: Merged
Merged: 6/23/2025
Merged by: @dhiltgen

Base: main ← Head: size_v11


📝 Commits (4)

  • 0d0c43d Re-remove cuda v11
  • aa676fe Simplify layout
  • 89b18f0 distinct sbsa variant for linux arm64
  • 6da73ce temporary prevent rocm+cuda mixed loading

📊 Changes

14 files changed (+67 additions, -66 deletions)

View changed files

📝 .github/workflows/release.yaml (+0 -7)
📝 .github/workflows/test.yaml (+3 -3)
📝 CMakeLists.txt (+7 -4)
📝 CMakePresets.json (+0 -13)
📝 Dockerfile (+7 -17)
📝 discover/cuda_common.go (+4 -0)
📝 discover/path.go (+1 -1)
📝 docs/gpu.md (+1 -1)
📝 docs/troubleshooting.md (+1 -1)
➕ llama/patches/0018-temporary-prevent-rocm-cuda-mixed-loading.patch (+32 -0)
📝 llm/server.go (+1 -1)
📝 ml/backend/ggml/ggml/src/ggml-backend-reg.cpp (+10 -2)
📝 scripts/build_windows.ps1 (+0 -14)
📝 scripts/env.sh (+0 -2)

📄 Description

Revert the revert: drop CUDA v11 support, which requires drivers newer than Feb 2023.

This reverts commit c6bcdc4223.
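With v11 gone, the loader only accepts CUDA drivers new enough for the v12 runtime. As a rough illustration of the kind of gate this implies in `discover/cuda_common.go`, here is a minimal, hypothetical sketch; the function name and the 12.0 cutoff are assumptions for illustration, not the PR's actual code:

```go
package main

import "fmt"

// cudaDriverSupported reports whether a detected CUDA driver version
// meets the minimum required once v11 support is dropped. The 12.0
// threshold here is an assumed placeholder; the real cutoff is defined
// in discover/cuda_common.go.
func cudaDriverSupported(major, minor int) bool {
	const minMajor, minMinor = 12, 0
	if major != minMajor {
		return major > minMajor
	}
	return minor >= minMinor
}

func main() {
	fmt.Println(cudaDriverSupported(11, 8)) // pre-cutoff v11 driver: rejected
	fmt.Println(cudaDriverSupported(12, 4)) // v12 driver: accepted
}
```

Systems on older v11-era drivers would fail this check and need a driver update (per the PR, one released after Feb 2023) to keep using GPU acceleration.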


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-29 14:56:45 -05:00

Reference: github-starred/ollama#60025