[GH-ISSUE #3931] Digest mismatch, file must be downloaded again #64474

Closed
opened 2026-05-03 17:47:53 -05:00 by GiteaMirror · 10 comments

Originally created by @tttt-0814 on GitHub (Apr 26, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3931

What is the issue?

I tried to pull nomic-embed-text, but got an error below.
I also tried pulling other models, but got the same error.

$ ollama pull nomic-embed-text
pulling manifest
pulling 970aa74c0a90... 100% ▕███████████████████████████████████████████████████████████████████████████████████████████▏ 274 MB
pulling c71d239df917... 100% ▕███████████████████████████████████████████████████████████████████████████████████████████▏ 11 KB
pulling ce4a164fc046... 100% ▕███████████████████████████████████████████████████████████████████████████████████████████▏ 17 B
pulling 31df23ea7daa... 100% ▕███████████████████████████████████████████████████████████████████████████████████████████▏ 420 B
verifying sha256 digest
Error: digest mismatch, file must be downloaded again: want sha256:970aa74c0a90ef7482477cf803618e776e173c007bf957f635f1015bfcfef0e6, got sha256:bea7e365d4085c35b0cfc78f9285682c6c7df7a15ac58f1905542659871024fd

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.1.32
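The failing step can be reproduced by hand: ollama hashes the downloaded blob and compares it against the digest in the manifest. A minimal sketch of that check, assuming `sha256sum` is available (the `verify_blob` helper name and the `/tmp` demo file are mine, not ollama's; on a Linux systemd install the real blobs typically live under the ollama user's `models/blobs` directory):

```shell
# Hash a blob and compare it with the expected digest, mirroring the
# "verifying sha256 digest" step that fails in the transcript above.
verify_blob() {
  local blob="$1" want="$2"
  local got
  got="sha256:$(sha256sum "$blob" | awk '{print $1}')"
  if [ "$got" = "$want" ]; then
    echo "digest ok"
  else
    echo "digest mismatch: want $want, got $got"
  fi
}

# Demo with a throwaway file standing in for a downloaded blob;
# the expected value is the well-known SHA-256 of the string "hello".
printf 'hello' > /tmp/blob-digest-demo
verify_blob /tmp/blob-digest-demo "sha256:2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"
```

If a manually computed hash of the blob on disk matches the "got" value from the error, the file really was corrupted in transit rather than on disk.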

GiteaMirror added the bug label 2026-05-03 17:47:53 -05:00

@FlorianBoehler commented on GitHub (Apr 26, 2024):

OS
Linux (Ubuntu 22.04.4 LTS)

Ollama version
0.1.32

Same issue. The proxy is already set via Environment= in ollama.service.

$ ollama pull nomic-embed-text
pulling manifest
pulling 970aa74c0a90... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 274 MB
pulling c71d239df917... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 11 KB
pulling ce4a164fc046... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 17 B
pulling 31df23ea7daa... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 420 B
verifying sha256 digest
Error: digest mismatch, file must be downloaded again: want sha256:970aa74c0a90ef7482477cf803618e776e173c007bf957f635f1015bfcfef0e6, got sha256:bea7e365d4085c35b0cfc78f9285682c6c7df7a15ac58f1905542659871024fd
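For reference, a proxy for the systemd-managed service is commonly set through a drop-in rather than by editing the unit directly. A sketch, assuming a systemd install; the proxy URL is a placeholder, not taken from this thread:

```shell
# Create a drop-in that sets the proxy for the ollama service,
# then reload systemd and restart the service to apply it.
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/proxy.conf <<'EOF'
[Service]
Environment="HTTPS_PROXY=http://proxy.example.com:8080"
EOF
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

A TLS-intercepting proxy is one plausible cause of a digest mismatch, since it can alter the downloaded bytes even when the transfer itself reports success.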


@FlorianBoehler commented on GitHub (Apr 26, 2024):

A similar error occurs when trying to create a model from a .gguf file

$ ollama create Llama3:1 -f /home/user/downloads/Modelfile
transferring model data
Error: digest mismatch, expected "sha256:133b647485d69ae9458aba8448342b8c5495b92343f1df87fefbaf374ab4edc6", got "sha256:b86ae9d4b692d139592a7794ca507067d73e9e3ca2366fa7291c33a19c6254c6"
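In the `ollama create` variant, the source is a local .gguf file, so the mismatch can also mean something is still writing to that file (an unfinished download, a sync client, antivirus). A quick stability check, assuming `sha256sum` is available (the `stable_hash` helper and the `/tmp` demo file are mine):

```shell
# Hash the same file twice a second apart; differing digests mean the
# file was being modified while ollama read it.
stable_hash() {
  local f="$1"
  local a b
  a=$(sha256sum "$f" | awk '{print $1}')
  sleep 1
  b=$(sha256sum "$f" | awk '{print $1}')
  if [ "$a" = "$b" ]; then echo "stable"; else echo "changed while hashing"; fi
}

# Demo with a throwaway file; point it at your .gguf in practice,
# e.g. stable_hash /home/user/downloads/your-model.gguf
printf 'demo' > /tmp/gguf-stability-check
stable_hash /tmp/gguf-stability-check
```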


@TheQuantumbyte commented on GitHub (Apr 26, 2024):

Same issue on macOS trying to pull any model (phi3, llama3). It hangs at 100% while pulling the model file for several minutes, then finishes the rest quickly and gives the same digest mismatch error.

% ollama pull phi3              
pulling manifest 
pulling 4fed7364ee3e... 100% ▕███████████████████████████████████████▏ 2.3 GB                         
pulling c608dc615584... 100% ▕███████████████████████████████████████▏  149 B                         
pulling fa8235e5b48f... 100% ▕███████████████████████████████████████▏ 1.1 KB                         
pulling d47ab88b61ba... 100% ▕███████████████████████████████████████▏  140 B                         
pulling f7eda1da5a81... 100% ▕███████████████████████████████████████▏  485 B                         
verifying sha256 digest 
Error: digest mismatch, file must be downloaded again: want sha256:4fed7364ee3e0c7cb4fe0880148bfdfcd1b630981efa0802a6b62ee52e7da97e, got sha256:c9cc8bfb526a9ce885b566f7b034769ca850df757292424af8ad2406826324ec


@nb001 commented on GitHub (Apr 26, 2024):

I ran into the same situation.


@TheQuantumbyte commented on GitHub (Apr 29, 2024):

Just as an added piece to the puzzle, I've been trying this on my work computer, which is running DNS filtering and other corporate software. I tried it at home with my personal machine and it worked correctly. I'm not sure if this failure is by design on the part of the DNS filtering solution or just an accident, but this was working fine last time I downloaded a model about 3 weeks ago.


@jmorganca commented on GitHub (May 9, 2024):

merging this with https://github.com/ollama/ollama/issues/941


@ThomasGmeinder commented on GitHub (Sep 16, 2024):

I still get this error with the latest Ollama 0.3.10 on Windows 11. I have been getting this error with the three previous versions too. The corporate network is not the issue because Ollama works fine for me on other Windows 11 machines in the corporate network and the error persists when I switch to a private network.

Command to reproduce: ollama run llama3.1:8b


@michaelrussell4 commented on GitHub (Nov 20, 2024):

I've had the same issue. It happens every time I try to download any model; the only exception I've been able to download is all-minilm, maybe because it's only 45 MB?

Windows 11


@doug62 commented on GitHub (Jan 30, 2025):

This will fix it - for some reason:
echo "nameserver 8.8.8.8" > /etc/resolv.conf # Thanks to -> https://www.youtube.com/watch?v=iSI-corX8HI
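That a public resolver fixes it fits the earlier "works at home, fails behind corporate DNS filtering" report: interception can redirect downloads and corrupt the blob digests. A diagnostic sketch comparing what the system resolver and 8.8.8.8 return for the registry host (the `compare_dns` helper and its awk filtering are mine; the output parsing assumes common `nslookup` formatting):

```shell
# Compare A records from the system resolver and from 8.8.8.8.
# Differing answers suggest DNS filtering or interception.
compare_dns() {
  local host="$1"
  local sys pub
  sys=$(nslookup "$host" 2>/dev/null | awk '/^Address/ && !/#/{print $2}' | sort)
  pub=$(nslookup "$host" 8.8.8.8 2>/dev/null | awk '/^Address/ && !/#/{print $2}' | sort)
  if [ "$sys" = "$pub" ]; then echo "resolvers agree"; else echo "resolvers differ"; fi
}

compare_dns registry.ollama.ai
```

If the resolvers disagree, the rewritten resolv.conf bypasses the filtering layer, which would explain why the workaround above helps.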


@keesj commented on GitHub (Jan 8, 2026):

> This will fix it - for some reason: echo "nameserver 8.8.8.8" > /etc/resolv.conf # Thanks to -> https://www.youtube.com/watch?v=iSI-corX8HI

I do indeed see different behaviour. In my case I am using the snap package.


Reference: github-starred/ollama#64474