[GH-ISSUE #12632] Trying to use qwen-coder:30b results with "The model you are attempting to pull requires a newer version of Ollama." #34144

Closed
opened 2026-04-22 17:27:52 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @futuremotiondev on GitHub (Oct 15, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12632

What is the issue?

I am using the latest version of Ollama, and trying to pull `qwen-coder:30b` gives me this error:

```
pull model manifest: 412: The model you are attempting to pull requires a newer version of Ollama.
Please download the latest version at: https://ollama.com/download
```

![Screenshot](https://github.com/user-attachments/assets/c0c118cd-fbf0-436f-86ce-de12cc02513b)

Relevant log output

Here are the log files from AppData\Local\Ollama:

[app.log](https://github.com/user-attachments/files/22924280/app.log)
[app-3.log](https://github.com/user-attachments/files/22924278/app-3.log)
[app-5.log](https://github.com/user-attachments/files/22924282/app-5.log)
[server.log](https://github.com/user-attachments/files/22924279/server.log)
[server-4.log](https://github.com/user-attachments/files/22924281/server-4.log)
[server-5.log](https://github.com/user-attachments/files/22924283/server-5.log)

OS

Windows

GPU

Nvidia

CPU

AMD

Ollama version

ollama version is 0.11.11, Warning: client version is 0.12.5
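
The version line above points at the likely root cause: the `ollama` CLI is 0.12.5, but the server it is talking to is 0.11.11, and the registry returns 412 because the *server* doing the pull is too old for this model. A minimal sketch of that kind of version-ordering check, using GNU `sort -V` (this is an illustration of the comparison, not Ollama's actual implementation):

```shell
# version_lt A B: succeed (exit 0) if version A sorts strictly before B
# under GNU sort's version ordering. Sketch only; Ollama's real check
# happens server-side against the model manifest's minimum version.
version_lt() {
  [ "$1" != "$2" ] && \
    [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
}

if version_lt "0.11.11" "0.12.5"; then
  echo "server 0.11.11 is older than client 0.12.5"
fi
```

The mismatch itself is the clue worth chasing: a freshly updated client would normally ship with a matching server, so a stale 0.11.11 server answering on the default port suggests a second installation is still running.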

GiteaMirror added the bug label 2026-04-22 17:27:52 -05:00
Author
Owner

@rick-github commented on GitHub (Oct 15, 2025):

Are you using WSL? Stop the ollama server running there.

Author
Owner

@shuntera commented on GitHub (Oct 21, 2025):

Do you also use Docker Desktop? It attaches to WSL for running its VM (optional), and it now also has the capability to run local LLMs. If I start my PC and run `ollama list`, I can see the 8 LLMs I have downloaded via Ollama. If I start my PC and open Docker Desktop first, then run `ollama list`, I see just 3 LLMs. I have tried to get an answer on this but am hitting a brick wall; I believe the two issues are related.

Author
Owner

@rick-github commented on GitHub (Oct 21, 2025):

You are running two different servers, each with its own download location.
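
In other words, which models `ollama list` shows depends on which server instance the client happens to reach. The client can be pinned to a specific server with the `OLLAMA_HOST` environment variable; a simplified sketch of that selection logic (the real client also handles URL schemes and bare ports, and 11435 below is a made-up example port for a second server):

```shell
# Sketch of how the client picks a server: use OLLAMA_HOST if set,
# otherwise fall back to Ollama's default of 127.0.0.1:11434.
resolve_host() {
  echo "${OLLAMA_HOST:-127.0.0.1:11434}"
}

unset OLLAMA_HOST
resolve_host                                # default server
OLLAMA_HOST=127.0.0.1:11435 resolve_host    # hypothetical second server
```

Running `ollama list` with `OLLAMA_HOST` pointed at each server in turn would show each instance's separate model store.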


Reference: github-starred/ollama#34144