[GH-ISSUE #1158] max retries exceeded: unexpected EOF #587

Closed
opened 2026-04-12 10:17:21 -05:00 by GiteaMirror · 15 comments

Originally created by @priamai on GitHub (Nov 16, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1158

Originally assigned to: @BruceMacD on GitHub.

Hi there,
I am not sure if this is related to your file service, but I am getting these connection dropouts very often.

![Screenshot from 2023-11-16 23-45-54](https://github.com/jmorganca/ollama/assets/57333254/d530f24e-af82-49d8-9435-0653922d1eec)

Maybe there is a way to throttle requests?

GiteaMirror added the bug label 2026-04-12 10:17:21 -05:00

@orlyandico commented on GitHub (Nov 17, 2023):

Just keep re-running the pull; it will resume from where it left off. I encounter it a lot because I only have a 200 Mbps connection and had three PCs downloading models at one time (it would be more efficient to download once and then SCP the blobs to the other PCs).
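The re-run-until-it-succeeds approach above can be wrapped in a small shell helper. A minimal sketch (the `retry` function name and `RETRY_DELAY` variable are illustrative, not part of the ollama CLI):

```shell
#!/bin/sh
# Illustrative helper: re-run a command until it succeeds, up to a
# maximum number of attempts. `ollama pull` resumes partial downloads,
# so each retry continues from where the last one left off.
retry() {
  max=$1; shift
  n=1
  until "$@"; do
    if [ "$n" -ge "$max" ]; then
      echo "giving up after $n attempts" >&2
      return 1
    fi
    n=$((n + 1))
    sleep "${RETRY_DELAY:-5}"   # pause briefly between attempts
  done
}

# Example (assumes the ollama CLI is installed):
# retry 20 ollama pull llama2
```

The blob-copying idea also mentioned above works because pulled models live under `~/.ollama/models` on Linux and macOS, so copying that directory with `scp` avoids re-downloading on each machine.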


@igorschlum commented on GitHub (Nov 17, 2023):

I have this problem too, more often when my internet connection is poor. In some cases I had to relaunch the pull more than 20 times. It will become an issue if a model is updated and Ollama runs as a standalone server. Is it possible to extend the number of retries?


@priamai commented on GitHub (Nov 17, 2023):

I have a 300 Mbps fiber connection here; I believe the issue is with where the file is downloaded from. What about putting those files on a more reliable service like AWS S3?


@igorschlum commented on GitHub (Nov 17, 2023):

You raise a good point, @priamai. Better bandwidth requires a better hosting solution, which is not free, but as Ollama grows this issue will become more and more important. The Ollama team could look for a sponsor who would pay for the hosting.


@orlyandico commented on GitHub (Nov 17, 2023):

I also tested this on AWS and Hyperstack instances and didn't get the EOF retries, so it appears to come down to one's internet connection.

My more pressing question is why Ollama has its own model repository rather than supporting downloading models from, say, Hugging Face.


@jmorganca commented on GitHub (Nov 17, 2023):

Hi folks, thanks for creating an issue. Looking into this; there really shouldn't be any EOF errors.


@priamai commented on GitHub (Nov 17, 2023):

We have some spare capacity on AWS S3 and would also be happy to sponsor. Send me a PM if you like. Cheers.


@BruceMacD commented on GitHub (Mar 11, 2024):

Thanks for the report here. The bug shown in this issue specifically is now fixed, as we run models directly rather than in a subprocess, although EOF errors can still occur. If anyone else sees an EOF, please open a new issue so we can triage it appropriately.


@MADAO-LUV commented on GitHub (Nov 12, 2024):

I also have the same issue. How can I solve this problem? I have tried reloading three times.


@MichelMichels commented on GitHub (Jan 28, 2025):

Just reporting that I had this issue multiple times while trying to download deepseek-r1:7b.

EDIT: Currently stuck at 4.1 GB/4.7 GB


@LeGao-HIT commented on GitHub (Jan 28, 2025):

> Just reporting that I had this issue multiple times while trying to download deepseek-r1:7b.

Agreed, I also encountered the same problem when downloading deepseek-r1:7b: `Error: max retries exceeded: EOF`


@muhamedljatifi commented on GitHub (Jan 28, 2025):

Yes, I have the same issue with the same model. Other models work fine.


@MengyangGao commented on GitHub (Jan 28, 2025):

Same problem when running `ollama run deepseek-r1`, which is from https://ollama.com/library/deepseek-r1.


@youkuzazazazaza commented on GitHub (Jan 28, 2025):

> Yes, I have the same issue with the same model. Other models work fine.

I changed to deepseek-r1:8b and it works well.


@micokonsep commented on GitHub (Jan 28, 2025):

Try this:

https://medium.com/@micowendy/ollama-error-max-retries-exceeded-eof-c8baf4a57ffc

It's in Indonesian, but you can copy and paste the code.

Regards,

Reference: github-starred/ollama#587