[GH-ISSUE #13720] Cannot Upgrade Ollama on Windows 11 — Server Version Stuck at 0.11.0 Despite Client Update #71054

Closed
opened 2026-05-04 23:51:38 -05:00 by GiteaMirror · 3 comments

Originally created by @gismc123 on GitHub (Jan 14, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/13720

What is the issue?

System Info:

  • OS: Windows 11
  • RAM: 32GB DDR4
  • GPU: NVIDIA RTX 3060
  • Ollama Client Installed Version: 0.14.0
  • Ollama Server/Daemon Version: 0.11.0
  • Issue: Cannot pull models like gpt-oss:20b due to version mismatch

Description:
After multiple attempts, I am unable to get the Ollama server/daemon to upgrade on my Windows 11 PC. Although I can install the latest client (0.14.0), the server remains at 0.11.0. This prevents me from pulling certain models, such as gpt-oss:20b, with the following error:

pulling manifest
Error: pull model manifest: 412:
The model you are attempting to pull requires a newer version of Ollama.

Troubleshooting Steps Taken:

  1. Verified versions:

     ollama --version

     Output:

     ollama version is 0.11.0
     Warning: client version is 0.14.0

  2. Uninstalled Ollama via Windows “Apps & Features” → confirmed uninstall.

  3. Deleted Ollama folders manually:

     • C:\Users\--username--\AppData\Local\Programs\Ollama
     • C:\Users\--username--\.ollama

  4. Attempted deletions from Safe Mode and Windows Recovery Environment (WinRE) Command Prompt — deletion succeeded in both environments.

  5. Reinstalled Ollama fresh from https://ollama.com/download.

  6. After reinstall and restart, the version mismatch persists: server 0.11.0, client 0.14.0.
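One diagnostic not listed above (an editorial suggestion, not something from the original report): the ollama CLI talks to whatever server answers at OLLAMA_HOST, which defaults to 127.0.0.1:11434, so querying that address directly shows which daemon is actually responding, independent of the installed binaries:

```shell
# Hedged sketch: print the server address the client will use, then ask
# that server for its version via Ollama's /api/version endpoint.
host="${OLLAMA_HOST:-127.0.0.1:11434}"
echo "client will query server at $host"
if command -v curl >/dev/null 2>&1; then
  curl -s "http://$host/api/version" || echo "no server reachable at $host"
else
  echo "curl not available"
fi
```

If the version returned here differs from the freshly installed client, something other than the native Windows service is answering on that port.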

Observations:

  • The mismatch persists even after completely removing all known Ollama folders and reinstalling.

Goal:

  • Fully upgrade the Ollama server/daemon to match client version 0.14.0.
  • Successfully pull models like gpt-oss:20b.

Request:

  • Guidance on how to completely remove any hidden or locked Ollama server components on Windows 11.
  • Steps to ensure that after reinstall, the Ollama server/daemon version matches the latest client.

Relevant log output


OS

Windows

GPU

Nvidia

CPU

AMD

Ollama version

0.11.0

GiteaMirror added the bug label 2026-05-04 23:51:38 -05:00

@rick-github commented on GitHub (Jan 15, 2026):

Have you previously installed ollama in WSL?


@gismc123 commented on GitHub (Jan 15, 2026):

> Have you previously installed ollama in WSL?

It's possible I have, but I do not remember. How do I check?
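For reference (an editorial sketch, not part of the original exchange), one way to check from Windows is to list the installed WSL distributions and probe the default distro for an ollama binary:

```shell
# Hedged sketch (intended to be run from Windows PowerShell/cmd): list WSL
# distros, then look for an ollama install inside the default distro.
if command -v wsl >/dev/null 2>&1; then
  wsl --list --verbose
  wsl -e sh -c 'command -v ollama && ollama -v || echo "ollama not found in WSL"'
else
  echo "wsl command not available (not on Windows?)"
fi
```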


@gismc123 commented on GitHub (Jan 15, 2026):

Resolution – Root Cause Was Ollama Running Inside WSL (Ubuntu)

Thank you so much for the direction. I googled how to check WSL and found my answer. I can't believe how stupid I feel after spending hours on this issue and nearly resorting to nuking my OS and reinstalling Windows.

The problem was not the Windows install itself — I had an older Ollama installation running inside WSL (Ubuntu), and the Windows client was connecting to that daemon instead of the native Windows service.

How I solved it:

  1. Opened PowerShell.

  2. Started a WSL Ubuntu session.

  3. In the Ubuntu terminal, ran:

    ollama -v
    
  4. Output showed:

    ollama version 0.11.0
    

    This confirmed that an outdated Ollama daemon was running inside WSL.

How I fixed it (inside Ubuntu / WSL):

sudo systemctl stop ollama       # stop the running daemon
sudo systemctl disable ollama    # keep it from starting on boot
sudo rm $(which ollama)          # remove the binary
sudo rm -r /usr/share/ollama     # remove models and service data
sudo userdel ollama              # remove the dedicated service user

After removing Ollama from WSL and restarting, the Windows Ollama installation correctly reported the newer version and the client/server version mismatch was resolved. I am now able to pull models successfully.

Closing this issue.
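An editorial note on why this setup was confusing: WSL2 forwards ports that are listening inside a distro to Windows localhost, which is how the Windows client was reaching the stale WSL daemon on the default port. On Windows the equivalent check would be `netstat -ano | findstr :11434` followed by `tasklist /FI "PID eq <pid>"`; the portable sketch below looks for anything bound to Ollama's default port 11434:

```shell
# Hedged sketch: see whether anything is bound to ollama's default port.
if command -v netstat >/dev/null 2>&1; then
  netstat -an 2>/dev/null | grep 11434 || echo "nothing listening on 11434"
else
  echo "netstat not available on this system"
fi
```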


Reference: github-starred/ollama#71054