[GH-ISSUE #8769] Model download fails at very last step 404GB DeepSeek 671b #67748

Open
opened 2026-05-04 11:34:03 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @kungfu-eric on GitHub (Feb 2, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8769

What is the issue?

```
ollama run deepseek-r1:671b
pulling manifest
pulling manifest
pulling manifest
pulling manifest
pulling manifest
pulling manifest
pulling 9801e7fce27d... 100% ___________________________________________________________________________________________________________________ 404 GB
Error: rename /root/.ollama/models/blobs/sha256-9801e7fce27dbf3d0bfb468b7b21f1d132131a546dfc43e50518631b8b1800a9-partial /root/.ollama/models/blobs/sha256-9801e7fce27dbf3d0bfb468b7b21f1d132131a546dfc43e50518631b8b1800a9: no such file or directory
```

Tried it several times and I'm now out of quota from my ISP! 70b, 32b, and others work mostly fine (had to pull 32b twice).
It's definitely downloading, because the disk fills up over the 2-hour download (is there a faster way to download?)
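The rename failure above suggests the `-partial` blob disappeared before the final rename. As a quick diagnostic, here is a sketch (assuming the blob layout shown in the error message; `find_partial_blobs` is a hypothetical helper, not part of Ollama) that lists leftover partial downloads:

```python
from pathlib import Path

def find_partial_blobs(blobs_dir: str) -> list[Path]:
    """Return any sha256-*-partial files left behind in the blobs directory.

    Hypothetical helper; assumes the ~/.ollama/models/blobs layout seen in
    the error output above, where in-flight downloads carry a -partial suffix.
    """
    return sorted(Path(blobs_dir).glob("sha256-*-partial"))
```

A non-empty result before re-running the pull would indicate an interrupted download whose partial blob was never renamed into place.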

OS

Linux

GPU

Nvidia

CPU

AMD

Ollama version

0.3.12

GiteaMirror added the bug label 2026-05-04 11:34:03 -05:00
Author
Owner

@jmorganca commented on GitHub (Feb 2, 2025):

I'm so sorry this happened. Will look into it. In the meantime, make sure to upgrade to the latest version of Ollama (0.5.7)

Author
Owner

@kungfu-eric commented on GitHub (Feb 3, 2025):

Thanks. I upgraded and ran `pull` instead and still no dice. It says the hash is wrong, which is bizarre.

```
ollama pull deepseek-r1:671b
pulling manifest
pulling 9801e7fce27d... 100% ___________________________________________________________________________________________________________________ 404 GB
pulling 369ca498f347... 100% ___________________________________________________________________________________________________________________  387 B
pulling 6e4c38e1172f... 100% ___________________________________________________________________________________________________________________ 1.1 KB
pulling f4d24e9138dd... 100% ___________________________________________________________________________________________________________________  148 B
pulling fdf3d6cb73c7... 100% ___________________________________________________________________________________________________________________  497 B
verifying sha256 digest
Error: digest mismatch, file must be downloaded again: want sha256:9801e7fce27dbf3d0bfb468b7b21f1d132131a546dfc43e50518631b8b1800a9, got sha256:93b6a4f11af22b4f7a2a7d545100ef4adbd7bd22d09b969f7954aefae0dd9a3b
```

Could there really be a corrupted bit 3x in a row? Maybe the weights don't have to be that perfect?
Edit: yeah, this model really needs to be split into more manageable pieces that can be individually hashed.
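For the digest-mismatch case, the check Ollama performs at the end of a pull can be reproduced by hand. This sketch (`verify_blob` is a hypothetical helper, assuming the `sha256-<hex digest>` blob filenames visible in the error output) recomputes a blob's SHA-256 and compares it with the digest encoded in its name:

```python
import hashlib
from pathlib import Path

def verify_blob(blob_path: str, chunk_size: int = 1 << 20) -> bool:
    """Recompute a blob's SHA-256 and compare it to the digest in its
    filename (assumed "sha256-<hex digest>", as shown in the errors above).

    Reads in chunks so a ~400 GB blob never needs to fit in memory.
    """
    path = Path(blob_path)
    expected = path.name.removeprefix("sha256-")
    h = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest() == expected
```

Running this on the downloaded blob before deleting it would distinguish an on-disk corruption from a transfer-time one.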

Author
Owner

@jmorganca commented on GitHub (Feb 3, 2025):

@kungfu-eric sorry about that. We're working on improving this so the file is split up - indeed, a single corrupted bit can cause issues.
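The "split up" approach mentioned here amounts to hashing fixed-size chunks independently, so a single flipped bit invalidates one chunk rather than the whole 404 GB file. A minimal sketch of the idea (hypothetical helpers, not Ollama's actual implementation):

```python
import hashlib

def chunk_digests(data: bytes, chunk_size: int) -> list[str]:
    """Hash each fixed-size chunk independently, so a corrupt chunk can be
    identified and re-fetched without discarding the whole download."""
    return [hashlib.sha256(data[i:i + chunk_size]).hexdigest()
            for i in range(0, len(data), chunk_size)]

def corrupt_chunks(data: bytes, expected: list[str], chunk_size: int) -> list[int]:
    """Indices of chunks whose digest no longer matches the expected list."""
    actual = chunk_digests(data, chunk_size)
    return [i for i, (a, e) in enumerate(zip(actual, expected)) if a != e]
```

With, say, 1 GB chunks, a digest mismatch would cost one chunk's re-download instead of the entire blob.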

Author
Owner

@kungfu-eric commented on GitHub (Feb 4, 2025):

https://github.com/ollama/ollama/issues/5245 is the relevant issue for the sharded-GGUF workaround; quite an old issue..

Author
Owner

@kungfu-eric commented on GitHub (Feb 4, 2025):

closing after note

Author
Owner

@anirbanbasu commented on GitHub (Aug 23, 2025):

Try [ollama-downloader](https://github.com/anirbanbasu/ollama-downloader). It has worked for me even behind an HTTPS proxy with a self-signed certificate, behind which `ollama pull` did not work for a single model, failing the SHA256 check despite being able to download all blobs. (Disclaimer: I am the maintainer of the ollama-downloader project.)

Reference: github-starred/ollama#67748