[PR #12931] [MERGED] Enable Vulkan with a temporary opt-in setting #45248

Closed
opened 2026-04-25 00:57:05 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/12931
Author: @dhiltgen
Created: 11/3/2025
Status: Merged
Merged: 11/12/2025
Merged by: @dhiltgen

Base: main ← Head: vulkan_opt_in


📝 Commits (4)

  • 38928f1 docs: vulkan information
  • f9e8db9 Revert "CI: Set up temporary opt-out Vulkan support (#12614)"
  • 9c01306 vulkan: temporary opt-in for Vulkan support
  • e8fe332 win: add vulkan CI build

📊 Changes

8 files changed (+84 additions, -46 deletions)

View changed files

📝 .github/workflows/release.yaml (+24 -14)
📝 .github/workflows/test.yaml (+1 -0)
📝 Dockerfile (+1 -26)
📝 discover/runner.go (+3 -0)
📝 docs/docker.mdx (+10 -0)
📝 docs/gpu.mdx (+41 -1)
📝 envconfig/config.go (+3 -3)
📝 scripts/build_linux.sh (+1 -2)

📄 Description

This will enable us to ship Vulkan in the official binaries, but will require an opt-in setting to enable it. Once we're confident it is stable, we can switch to enabling it by default.

Draft while we work through a few remaining Vulkan topics:

  • ~~Intel VRAM reporting~~ done
  • ~~Fix scheduling logic to not favor iGPUs with large system memory over "smaller" discrete GPUs.~~

Fixes #12848


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-25 00:57:05 -05:00

Reference: github-starred/ollama#45248