[PR #14081] MSI Based windows install with "curl" style install script. #76798

Open
opened 2026-05-05 09:28:31 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/14081
Author: @dhiltgen
Created: 2/5/2026
Status: 🔄 Open

Base: main ← Head: msi


📝 Commits (1)

  • 7538586 MSI curl style - phase 1 implementation

📊 Changes

26 files changed (+3678 additions, -74 deletions)

View changed files

📝 CMakeLists.txt (+18 -0)
📝 app/README.md (+6 -42)
📝 app/cmd/app/app.go (+10 -2)
➕ app/msi/CMakeLists.txt (+421 -0)
➕ app/msi/Folders.wxs (+20 -0)
➕ app/msi/chained-uninstall.ps1 (+26 -0)
➕ app/msi/cuda-v12.wxs (+23 -0)
➕ app/msi/cuda-v13.wxs (+23 -0)
➕ app/msi/dependencies/Folders.wxs (+20 -0)
➕ app/msi/dependencies/cuda-v12-deps.wxs (+21 -0)
➕ app/msi/dependencies/cuda-v13-deps.wxs (+21 -0)
➕ app/msi/dependencies/generate-backend-deps.ps1 (+222 -0)
➕ app/msi/dependencies/rocm-deps.wxs (+21 -0)
➕ app/msi/dependencies/vulkan-deps.wxs (+21 -0)
➕ app/msi/generate-packages-json.ps1 (+140 -0)
➕ app/msi/ollama-core-arm64.wxs (+117 -0)
➕ app/msi/ollama-core.wxs (+130 -0)
➕ app/msi/rocm.wxs (+23 -0)
➕ app/msi/vulkan.wxs (+23 -0)
📝 app/server/server.go (+26 -7)

...and 6 more files

📄 Description

This implements a new MSI-based installer flow for Windows. The components are broken down into multiple MSIs: a minimal core MSI contains just Ollama for CPU (or cloud) inference, and each GPU backend is split into two MSIs - one for the dependency libraries that rarely change, and one for the Ollama libraries for that GPU backend. This decoupling allows more efficient upgrade flows, since the dependency MSIs can be omitted from most releases because they are unchanged.

A new PowerShell-based "curl style" install/upgrade script is now available which handles installing the MSIs. By default it discovers the GPUs in the user's system and installs the applicable components. An optional -Minimal variation installs just CPU/cloud support, and -All installs all GPU backends. After this ships, install becomes as simple as:

```
irm https://ollama.com/install.ps1 | iex
```

The MSIs should be suitable for downstream packaging in tools like winget, choco, etc. as well as mass deployment tooling.

This PR retains the InnoSetup installer and leaves the app upgrade flow wired to OllamaSetup.exe. In a future PR we can add logic in the app to support both modes of upgrading and deprecate the OllamaSetup.exe installer, then follow up with a final change to switch entirely to MSI-based installs.

Example sizes from this branch:

```
 16M Feb  4 17:36 dist/ollama-core-arm64.msi
 18M Feb  4 17:36 dist/ollama-core.msi
469M Feb  4 17:29 dist/ollama-cuda-deps-12.8.61.msi
340M Feb  4 17:36 dist/ollama-cuda-v12.msi
370M Feb  4 17:29 dist/ollama-cuda-v13-deps-13.0.48.msi
104M Feb  4 17:30 dist/ollama-cuda-v13.msi
 79M Feb  4 17:29 dist/ollama-rocm-deps-6.2.41512.msi
 33M Feb  4 17:31 dist/ollama-rocm.msi
392K Feb  4 17:29 dist/ollama-vulkan-deps-1.4.321.msi
6.6M Feb  4 17:30 dist/ollama-vulkan.msi
```

Marking draft for now as additional CI work is required before we merge, and the doc updates should be split into a follow-up PR to avoid confusion prior to a shipped release.

Fixes #8005
Fixes #10514
Fixes #10270
Fixes #10624
Fixes #10227
Partial #12268
Enables #5880


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-05-05 09:28:31 -05:00

Reference: github-starred/ollama#76798