[GH-ISSUE #5132] CANNOT DOWNLOAD MODELS #65271

Closed
opened 2026-05-03 20:16:21 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @Udacv on GitHub (Jun 19, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5132

What is the issue?

Recently, when I use `ollama run` to download models, every download fails with the error shown below.

![QQ截图20240619111403](https://github.com/ollama/ollama/assets/126667614/a4465567-74aa-4869-b12d-6b6d7d5701ea)

I'm in China; the download fails both over my local Internet connection and through a VPN.

OS

Windows

GPU

AMD

CPU

AMD

Ollama version

0.1.44

GiteaMirror added the bug label 2026-05-03 20:16:21 -05:00

@Udacv commented on GitHub (Jun 19, 2024):

WHO CAN HELP ME /(ㄒoㄒ)/~~


@AncientMystic commented on GitHub (Jun 19, 2024):

Not directly related to your bug, but you could use Open WebUI and either download the models manually from Ollama, or download GGUF files from Hugging Face and upload them manually through the Open WebUI instance (it's easy to deploy with Docker).

I'd also suggest making sure that a firewall, such as the built-in Windows firewall, isn't blocking Ollama's model downloads.
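For anyone following this suggestion, a minimal sketch of the manual-import route looks like the following. It assumes you have already downloaded a GGUF file from Hugging Face by some other means (browser, `curl`, etc.); `model.gguf` and `my-local-model` are placeholder names, not anything specific to this issue:

```shell
# Offline import sketch: register a locally downloaded GGUF file with Ollama
# instead of pulling it from the registry.
# "model.gguf" is a placeholder -- download any GGUF from Hugging Face first.

# Write a minimal Modelfile pointing at the local weights.
printf 'FROM ./model.gguf\n' > Modelfile
cat Modelfile

# Register and run the model with Ollama (skipped here if ollama
# is not installed, so the snippet degrades gracefully).
if command -v ollama >/dev/null 2>&1; then
    ollama create my-local-model -f Modelfile
    ollama run my-local-model
fi
```

This sidesteps the registry download entirely, which is why it can help when `ollama run <model>` fails on the network path.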


@Udacv commented on GitHub (Jun 19, 2024):

> Not directly related to your bug, but you could use Open WebUI and either download the models manually from Ollama, or download GGUF files from Hugging Face and upload them manually through the Open WebUI instance (it's easy to deploy with Docker).
>
> I'd also suggest making sure that a firewall, such as the built-in Windows firewall, isn't blocking Ollama's model downloads.

Thanks, I will try. (●'◡'●)


@AncientMystic commented on GitHub (Jun 19, 2024):

> > Not directly related to your bug, but you could use Open WebUI and either download the models manually from Ollama, or download GGUF files from Hugging Face and upload them manually through the Open WebUI instance (it's easy to deploy with Docker).
> >
> > I'd also suggest making sure that a firewall, such as the built-in Windows firewall, isn't blocking Ollama's model downloads.
>
> Thanks, I will try. (●'◡'●)

You're welcome, hopefully that works for you. I always just use links from Hugging Face, or upload models I've downloaded from it; there's a much wider selection there anyway. Just search a model name plus "GGUF" and you'll find a ton of Ollama-compatible models to use.


Reference: github-starred/ollama#65271