[GH-ISSUE #8498] Network issues with pulling model from ollama #5476

Closed
opened 2026-04-12 16:42:22 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @JB-Bryant on GitHub (Jan 20, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8498

What is the issue?

When I try to pull a model from ollama through a proxy, whether a large model or a small one (~2 GB), the download seems to restart over and over: progress climbs from, say, 120 MB to 160 MB, then drops back to 100 MB, and it finally fails with "max retries exceeded". This has persisted for about a week; it worked well before. I have already tried a different network, but it still doesn't work. :(
Thanks for your assistance!
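
For anyone pulling through a proxy: ollama reads the standard HTTPS_PROXY environment variable. A minimal sketch, with a placeholder proxy address (proxy.example.com:8080 is an assumption; substitute your own):

```shell
# Placeholder proxy address; replace with your actual proxy.
# For a server started from this shell:
export HTTPS_PROXY=https://proxy.example.com:8080
ollama serve

# For the macOS app, set the variable for launchd, then restart Ollama:
launchctl setenv HTTPS_PROXY https://proxy.example.com:8080
```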

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.5.7

GiteaMirror added the bug label 2026-04-12 16:42:22 -05:00

@rick-github commented on GitHub (Jan 20, 2025):

https://github.com/ollama/ollama/issues/8484


@Willian7004 commented on GitHub (Jan 21, 2025):

I have a similar problem on Windows.


@Willian7004 commented on GitHub (Jan 21, 2025):

> I have a similar problem on Windows.

I was using v0.5.7 when I found the problem. I downloaded a model successfully 10 days ago with v0.5.5, but I tried v0.5.5 again just now and got the same problem. Maybe the error is caused by the server.


@JB-Bryant commented on GitHub (Jan 21, 2025):

> > I have a similar problem on Windows.
>
> I was using v0.5.7 when I found the problem. I downloaded a model successfully 10 days ago with v0.5.5, but I tried v0.5.5 again just now and got the same problem. Maybe the error is caused by the server.

Thanks for your reply! Did you use a proxy to download the model too? I hadn't downloaded any model for about 2 months. This problem happened with 0.5.6 and 0.5.7 on my Mac. I thought it was because the multi-threaded download didn't merge correctly, but now it looks like a server problem. :(


@JB-Bryant commented on GitHub (Jan 21, 2025):

> #8484

Thx!


@MSR2201 commented on GitHub (Jan 23, 2025):

Apparently a lot of people are getting this error. What is the solution for this?


@JB-Bryant commented on GitHub (Jan 24, 2025):

> Apparently a lot of people are getting this error. What is the solution for this?

https://github.com/ollama/ollama/issues/8484#issuecomment-2608929050
Hey! This person's method is probably a solution! I have successfully downloaded deepseek-r1:1.5b! The script works well on my computer, though deepseek-r1:1.5b is not a very capable model :). I will try to download a larger model later.


@rick-github commented on GitHub (Mar 4, 2025):

This should be mitigated as of 0.5.8 by #8831, and #9294 provides an overhaul of model pulling, so I'm closing this, but feel free to add updates if you are still having issues.

Reference: github-starred/ollama#5476