[GH-ISSUE #14177] Can't remove partial downloads after canceling model pull #55754

Closed
opened 2026-04-29 09:41:51 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @swoorr on GitHub (Feb 9, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14177

The Problem

I started downloading qwen3-coder:30b but had to cancel it halfway through (Ctrl+C). The partial download files are still sitting on my disk taking up 17GB, but I can't remove them with ollama rm because the model isn't actually "installed".

What happened

ollama run qwen3-coder:30b
pulling manifest
pulling 1194192cf2a1: 55% ▕█████████ ▏ 10 GB/ 18 GB 30 MB/s
^C

When I run it again, it resumes from 55%. That's nice, but what if I don't want it anymore?

ollama rm qwen3-coder:30b

Error: model 'qwen3-coder:30b' not found

ollama list

qwen3-coder not in the list

du -sh ~/.ollama

13G (!)

The partial files are in ~/.ollama/models/blobs/ with names like:

  • sha256-1194192cf2a187eb02722edcc3f77b11d21f537048ce04b67ccf8ba78863006a-partial
  • Plus a bunch of -partial-0, -partial-1, etc.

What I expected

Either ollama rm should work for partial downloads, or there should be something like ollama prune to clean up these orphaned files.

Workaround

Had to manually delete them:

rm ~/.ollama/models/blobs/*-partial*

This freed up ~10GB.
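A safer version of that manual workaround is to preview the orphaned partial blobs (and their total size) before deleting anything. This sketch assumes the default model directory; the OLLAMA_MODELS environment variable can relocate it:

```shell
# Preview the orphaned partial-download blobs and their combined size.
blobs="$HOME/.ollama/models/blobs"
find "$blobs" -name '*-partial*' -exec du -ch {} + 2>/dev/null | tail -n 1

# If the list looks right, delete them. Only do this while no pull is in
# progress, otherwise the resume data for that pull is lost.
find "$blobs" -name '*-partial*' -delete 2>/dev/null
```

The `*-partial*` glob matches both the main `-partial` blob and the numbered `-partial-0`, `-partial-1`, ... chunks, while leaving completed blobs untouched.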

System

macOS, latest ollama
Would be great to have a cleaner way to handle this!

GiteaMirror added the feature request label 2026-04-29 09:41:51 -05:00
Author
Owner

@swoorr commented on GitHub (Feb 9, 2026):

Update: Still an Issue

I found that PR #9489 was merged on Mar 4, 2025 and introduced the ollama prune command as the solution. However, when I try to use it:

ollama prune
Error: unknown command "prune" for "ollama"

This indicates that while the fix was merged, it hasn't been released in the current version yet. The partial download cleanup problem described in this issue is still affecting users.

Questions:

  • When will this feature be released to users?
  • Can we expedite the release since users are losing significant storage space (10-17GB in this case)?
  • Would it help to reference the related PRs (#9489, #13951) and issue #13885 to track the cleanup improvements?

This is still blocking real users from cleaning up orphaned partial files. The infrastructure exists but isn't accessible yet.

<!-- gh-comment-id:3873806091 --> @swoorr commented on GitHub (Feb 9, 2026): ## Update: Still an Issue I found that PR #9489 was merged on Mar 4, 2025 and introduced the `ollama prune` command as the solution. However, when I try to use it: ```bash ollama prune Error: unknown command "prune" for "ollama" ``` This indicates that while the fix was merged, it hasn't been released in the current version yet. The partial download cleanup problem described in this issue is still affecting users. **Questions:** - When will this feature be released to users? - Can we expedite the release since users are losing significant storage space (10-17GB in this case)? - Would it help to reference the related PRs (#9489, #13951) and issue #13885 to track the cleanup improvements? This is still blocking real users from cleaning up orphaned partial files. The infrastructure exists but isn't accessible yet.
Author
Owner

@rick-github commented on GitHub (Feb 9, 2026):

Restarting the server will run a housecleaning cycle and remove unused blobs.
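One way to confirm that the housecleaning pass worked after a restart (e.g. `systemctl restart ollama` on a Linux systemd install, `docker restart ollama` for the container, or quitting and relaunching the macOS app) is to count any surviving partial blobs. This sketch assumes the default model directory:

```shell
# Count partial-download blobs left behind after the server restart.
# A result of 0 means the housecleaning cycle removed them all.
find "$HOME/.ollama/models/blobs" -name '*-partial*' 2>/dev/null | wc -l
```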

Author
Owner

@heapsoftware commented on GitHub (Feb 19, 2026):

No, I still get this on the latest version with Docker. I deleted this model using rm before I noticed these errors:

time=2026-02-19T20:28:35.445Z level=WARN source=routes.go:1677 msg="corrupt manifests detected, skipping prune operation. Re-pull or delete to clear" error="registry.ollama.ai/fotiecodes/jarvis:latest EOF"
root@ollama:~# docker exec -it ollama ollama rm fotiecodes/jarvis:latest
Warning: unable to stop model 'fotiecodes/jarvis:latest'
Error: EOF
root@ollama:~# docker exec -it ollama ollama -v
ollama version is 0.16.2
root@ollama:~# docker exec -it ollama ollama ls | grep jarvis
root@ollama:~#

Author
Owner

@rick-github commented on GitHub (Feb 19, 2026):

docker exec -it ollama rm /root/.ollama/models/manifests/registry.ollama.ai/fotiecodes/jarvis/latest
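The path in that command follows directly from the model reference: manifests live under `<models>/manifests/<registry>/<namespace>/<model>/<tag>`, with the `:` before the tag becoming a path separator. A small sketch of the mapping (the `/root/.ollama` root is the Docker image default and differs on host installs):

```shell
# Derive the manifest path for a model reference like "fotiecodes/jarvis:latest".
ref="fotiecodes/jarvis:latest"
echo "/root/.ollama/models/manifests/registry.ollama.ai/${ref%%:*}/${ref##*:}"
# → /root/.ollama/models/manifests/registry.ollama.ai/fotiecodes/jarvis/latest
```

Deleting the broken manifest file this way clears the "corrupt manifests detected" warning, after which the next housecleaning cycle can prune the now-unreferenced blobs.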
Reference: github-starred/ollama#55754