[GH-ISSUE #528] 416 response when pulling a model #244

Closed
opened 2026-04-12 09:45:54 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @codazoda on GitHub (Sep 14, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/528

I'm getting the following error when I try to pull the llama2-uncensored model.

```
$ ollama pull llama2-uncensored
pulling manifest
Error: download failed: on download registry responded with code 416:
```

This might be a registry problem or a problem with the model I'm pulling. I'm not really sure of the appropriate place to report the error. It's also entirely possible this has something to do with my internet connection, as I'm currently traveling.

Pulling llama2 works fine. Pulling llama2-uncensored:7b works fine (which should be the same thing, I think).

Author
Owner

@BruceMacD commented on GitHub (Sep 14, 2023):

Thanks for reporting this.

This error (416) indicates that the resumable model download is trying to continue even though the file is already fully downloaded. The download was probably corrupted by the unstable connection. To fix the problem in this case, try updating to the most recent version of Ollama; it should purge the partially downloaded file and let you try again.
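For context, HTTP 416 (Range Not Satisfiable) is returned when a `Range` request asks for bytes at or beyond the end of the resource, which is exactly what happens when a resume request starts at the full blob size. A minimal Go sketch of the resume decision (hypothetical function names; this is not Ollama's actual code in `server/download.go`):

```go
package main

import "fmt"

// resumeOffset decides where a resumed download should start.
// If the local partial file is already at least as large as the
// remote blob, a resume request with "Range: bytes=<size>-" would
// be answered with 416 Range Not Satisfiable, so we signal that
// the partial file should be discarded and the download restarted.
func resumeOffset(partialSize, remoteSize int64) (offset int64, restart bool) {
	if partialSize >= remoteSize {
		return 0, true // stale or corrupt partial file: start over
	}
	return partialSize, false // resume from where we left off
}

// rangeHeader builds the Range header value for resuming at offset.
func rangeHeader(offset int64) string {
	return fmt.Sprintf("bytes=%d-", offset)
}

func main() {
	off, restart := resumeOffset(50, 100)
	fmt.Println(off, restart, rangeHeader(off)) // 50 false bytes=50-

	off, restart = resumeOffset(100, 100)
	fmt.Println(off, restart) // 0 true
}
```

The key point is checking the local size against the remote size before issuing the `Range` request, rather than letting the registry answer with 416.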

I'm going to leave this issue open because the error should be handled better than just blocking you from downloading.

Author
Owner

@TahaScripts commented on GitHub (Sep 16, 2023):

@BruceMacD could you point me to where the downloading is happening? I presume the error handling is within the `pull` function defined in /main/api/client.py. Seems like a good beginner contribution for me.

Author
Owner

@BruceMacD commented on GitHub (Sep 18, 2023):

@TahaScripts
The download is happening in `server/download.go`. The root issue here could be a bit tough; I think the solution will be doing checksums on smaller chunks while the file is being downloaded.

Author
Owner

@mchiang0610 commented on GitHub (Sep 30, 2023):

@codazoda thank you for creating this issue! This should be resolved.

@TahaScripts Thank you for offering to help as well! Really appreciate it. We'll make more improvements here to increase download/upload speeds to the ollama library -- including the chunking.


Reference: github-starred/ollama#244