[GH-ISSUE #13362] False error: Unable to download model. Please check your internet connection to download the model for offline use #8824

Closed
opened 2026-04-12 21:36:27 -05:00 by GiteaMirror · 3 comments

Originally created by @davidgilbertson on GitHub (Dec 6, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13362

What is the issue?

In the UI, I pick a model I haven't downloaded and start a chat. Instead of downloading, I get the error: "Unable to download model. Please check your internet connection to download the model for offline use".

I've got fast fibre, no network issues, so whatever logic leads to that error isn't actually checking whether or not I've got an internet connection.

Relevant log output


OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.13.1

GiteaMirror added the bug label 2026-04-12 21:36:27 -05:00

@rick-github commented on GitHub (Dec 7, 2025):

Open a CMD window and run `ollama pull <model-name>`. What's the result?


@davidgilbertson commented on GitHub (Dec 7, 2025):

Since posting the issue, I had run `ollama --version` in a terminal (if that's relevant), then back in the UI opened a new chat and tried again, and that worked. So it's all working now, but still, it seems like something is wrong in the logic that leads to that error. It would be extremely unusual that I actually had a network issue at the time (I'd just been on ollama.com to get the model name, had music streaming, etc.)


@rick-github commented on GitHub (Dec 7, 2025):

Could have been a transient issue anywhere between your machine and the server hosting the model; the [server log](https://docs.ollama.com/troubleshooting) may have more information. Closing for now, but feel free to add comments/re-open if the issue happens again.


Reference: github-starred/ollama#8824