[GH-ISSUE #695] Can't resume download (pull) on restart server #26080

Closed
opened 2026-04-22 02:00:28 -05:00 by GiteaMirror · 9 comments
Owner

Originally created by @KcZLog on GitHub (Oct 4, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/695

Auto-pruning on server start was added in #491, but it discards the progress of unfinished/failed downloads whenever the server restarts.

Please change this so interrupted downloads can be resumed.

Suggestions:

  1. Don't auto-prune; pruning on delete is probably enough.
  2. If auto-pruning is wanted, use a separate directory for unfinished downloads, or a name prefix (though a new version may orphan unfinished files).
  3. Use a file that lists every download in progress, and don't prune those files. When a new version updates the list, files no longer listed get pruned automatically.
  4. Create the manifest immediately, before the download starts, so those files are protected from pruning, and add and check a property marking the download as incomplete.
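Suggestion 3 could be sketched roughly like this. This is purely illustrative: the directory layout, blob names, and `downloads.list` file are made up for the example and are not Ollama's actual on-disk format.

```shell
# Sketch of suggestion 3: keep a list of in-progress downloads and skip
# those blobs when pruning, so partial pulls survive a server restart.
blobs_dir=$(mktemp -d)
touch "$blobs_dir/sha256-aaa" "$blobs_dir/sha256-bbb"
printf 'sha256-bbb\n' > "$blobs_dir/downloads.list"

kept=""; pruned=""
for f in "$blobs_dir"/sha256-*; do
  name=$(basename "$f")
  if grep -qx "$name" "$blobs_dir/downloads.list"; then
    kept="$kept$name "      # in-progress: leave the partial blob alone
  else
    pruned="$pruned$name "  # a real pass would also check model manifests
  fi
done
echo "kept: $kept"
echo "pruned: $pruned"
```

The point of the sketch is that the prune pass only needs one extra membership check; everything not listed behaves exactly as today.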
GiteaMirror added the bug label 2026-04-22 02:00:28 -05:00

@BruceMacD commented on GitHub (Oct 5, 2023):

Hi @KcZLog, as a workaround you can set the `OLLAMA_NOPRUNE` environment variable.

For example:

```
OLLAMA_NOPRUNE=true ollama serve
```

Or export it from your shell profile so it persists.
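On Linux, where the installer typically runs Ollama as a systemd service, a persistent way to set the variable is a service override, following the pattern in Ollama's FAQ (the unit name `ollama.service` is the installer's default):

```shell
sudo systemctl edit ollama.service
# In the editor, add:
#   [Service]
#   Environment="OLLAMA_NOPRUNE=true"
sudo systemctl daemon-reload
sudo systemctl restart ollama
```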

<!-- gh-comment-id:1749043351 -->

@KcZLog commented on GitHub (Oct 5, 2023):

Hi, thank you. I'm aware of it and have had the `OLLAMA_NOPRUNE` environment variable set since finding #491, and it does prevent pruning on start/serve.

But other people who hit download issues may be frustrated to lose their progress when restarting the server, especially since #330 hints at restarting it. Hopefully this gets fixed soon.

<!-- gh-comment-id:1749328329 -->

@technovangelist commented on GitHub (Dec 4, 2023):

It looks like Bruce's comment solved your issue, so I will go ahead and close it now. If you think there is anything we left out, reopen it and we can address it. Thanks for being part of this great community.

<!-- gh-comment-id:1839352786 -->

@technovangelist commented on GitHub (Dec 9, 2023):

Actually, it looks like the only mention of that environment variable is in a buried issue and PR. I am updating the docs to mention it.

<!-- gh-comment-id:1847994442 -->

@technovangelist commented on GitHub (Dec 9, 2023):

And I am going to reopen this, because `OLLAMA_NOPRUNE` is definitely a workaround and not the actual solution.

<!-- gh-comment-id:1848073773 -->

@mxyng commented on GitHub (Jan 16, 2024):

Pruning by default is the desired behaviour. The issue seems to be restarting the server after a failed download, which should _not_ be the go-to solution. Instead, re-pulling will resume where the previous download left off.

Restarting the server should be the last resort.
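Concretely, the intended flow described above is just to interrupt and rerun the same pull rather than restart the server (model name here is only an example):

```shell
# If a pull stalls, interrupt it (Ctrl+C) and run the same pull again;
# already-downloaded layer data is reused and the transfer resumes.
ollama pull llama3.3
```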

<!-- gh-comment-id:1894615049 -->

@cyangalaxy commented on GitHub (Aug 3, 2024):

Ollama's pull feature has been completely unreliable for me; 90% of the time the download stalls.

Because of this, I instead download models as GGUF files from Hugging Face and import them. That way I have the model as a plain file and don't need to re-download it if I move computers or reinstall my OS.
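That import path uses Ollama's Modelfile `FROM` directive pointing at a local GGUF file; the file name and model name below are placeholders:

```shell
# Point a Modelfile at a locally downloaded GGUF file...
printf 'FROM ./my-model.gguf\n' > Modelfile
# ...register it with Ollama under a name of your choosing, then run it.
ollama create my-model -f Modelfile
ollama run my-model
```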

<!-- gh-comment-id:2266717594 -->

@siakc commented on GitHub (Dec 16, 2024):

Recently I tried to pull llama3.3, which is about 40 GB. It got stuck at 50%. I cancelled and repeated the pull, and it got stuck again. I had to restart the system, and this time the download started from scratch even though `OLLAMA_NOPRUNE` was set.

<!-- gh-comment-id:2546172376 -->

@exalented commented on GitHub (Oct 27, 2025):

Extremely annoying. Just make `OLLAMA_NOPRUNE=true` the default and let people prune on their own; losing a download really hurts on slow connections.

<!-- gh-comment-id:3449644514 -->
Reference: github-starred/ollama#26080