[GH-ISSUE #213] ollama rm <model> doesn't remove cache #87

Closed
opened 2026-04-12 09:37:49 -05:00 by GiteaMirror · 1 comment

Originally created by @kusold on GitHub (Jul 25, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/213

I had an internet hiccup while downloading the model, which left it in a corrupt state. In order to redownload the model, I did `ollama rm llama2`, but when I went to re-pull the model it used the cache in `~/.ollama/models` (3.8/3.8 GB, 17 TB/s -- I wish my internet was that fast).

```
❯ ollama list
NAME    SIZE    MODIFIED

~
❯ ollama pull llama2
pulling manifest
pulling 8daa9615cce3... 100% |█████████████████████████████████████████████████████████████████| (3.8/3.8 GB, 17 TB/s)
pulling 2cc93ea1ade8... 100% |███████████████████████████████████████████████████████████████████| (90/90 B, 1.4 MB/s)
pulling a73730bc2562... 100% |█████████████████████████████████████████████████████████████████| (509/509 B, 6.7 MB/s)
pulling 13af22070723... 100% |█████████████████████████████████████████████████████████████████| (4.4/4.4 kB, 67 MB/s)
pulling 6d9acd31eb66... 100% |█████████████████████████████████████████████████████████████████| (373/373 B, 3.8 MB/s)
verifying sha256 digest
Error: stream: digest mismatch: want sha256:8daa9615cce30c259a9555b1cc250d461d1bc69980a274b44d7eda0be78076d8, got sha256:d432b8ee86866337825aab3e6fa502bb4cb85701fd5f83c0cc12f86de28fb4a5
```

After `rm -rf ~/.ollama/models/* && ollama pull llama2` everything started working again. Is it expected that the model blobs should remain after `ollama rm`?
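For anyone hitting the same digest mismatch, a less drastic workaround than wiping the whole models directory is to find the one corrupt blob by recomputing each cached file's digest and comparing it to its filename. This is only a sketch: it assumes blobs live under `~/.ollama/models/blobs` and are named `sha256-<digest>`, a layout inferred from this report rather than from Ollama's documentation.

```shell
# check_blobs: recompute each blob's SHA-256 and compare it to the
# digest embedded in its filename (assumed "sha256-<hex>" naming).
check_blobs() {
  dir="$1"
  status=0
  for f in "$dir"/sha256-*; do
    [ -e "$f" ] || continue                 # skip if the glob matched nothing
    want="${f##*/sha256-}"                  # digest taken from the filename
    got="$(sha256sum "$f" | awk '{print $1}')"  # digest of the actual bytes
    if [ "$want" = "$got" ]; then
      echo "OK      $f"
    else
      echo "CORRUPT $f"
      status=1
    fi
  done
  return $status
}
```

A blob reported as `CORRUPT` could then be deleted individually before re-pulling, instead of removing everything under `~/.ollama/models`.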

GiteaMirror added the bug label 2026-04-12 09:37:49 -05:00

@jmorganca commented on GitHub (Oct 11, 2023):

Hi there @kusold. Going to close this issue, but please do let me know if you continue seeing this error

