[GH-ISSUE #2167] Deleting a model isn't removing its blob #1238

Closed
opened 2026-04-12 11:00:38 -05:00 by GiteaMirror · 3 comments

Originally created by @racso-dev on GitHub (Jan 24, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2167

Bug Report

Description

Bug Summary:
When I try to delete a model through the settings UI, it doesn't seem to work properly.

Steps to Reproduce:
Settings > Select a model to delete > Delete

Expected Behavior:
It should delete the model, and /usr/share/ollama/.ollama/models/blobs should therefore no longer contain the model's blob.

Actual Behavior:
The model's blob isn't removed from /usr/share/ollama/.ollama/models/blobs, so the disk space isn't freed.

Environment

  • Operating System: Ubuntu 22.04
  • Browser (if applicable): Chrome Version 120.0.6099.224 (Official Build) (64-bit)

Reproduction Details

Confirmation:

  • [Y] I have read and followed all the instructions provided in the README.md.
  • [Y] I have reviewed the troubleshooting.md document.
  • [N] I have included the browser console logs. (Not relevant, but maybe I'm wrong)
  • [N] I have included the Docker container logs. (Not relevant, but maybe I'm wrong)

Installation Method

I installed the project by building a Docker container. I deployed the ollama inference server on a remote machine and included its URL in the Docker container's environment.


@mchiang0610 commented on GitHub (Jan 24, 2024):

@racso-dev sorry about this! May I ask how this was installed? Ollama doesn't yet have a GUI. Are you using the community project https://github.com/ollama-webui?


@racso-dev commented on GitHub (Jan 25, 2024):

I installed ollama with curl https://ollama.ai/install.sh | sh, and I'm indeed using the community project ollama-webui.


@pdevine commented on GitHub (Jan 25, 2024):

Hey @racso-dev, we don't have a web UI, so I'm not sure how the front end you're using is trying to delete models.

That said, if you delete a model through the API or with ollama rm <model>, which blobs get deleted depends on whether other models are using the same blob. Blobs are shared between models to deduplicate storage. A shared blob won't be deleted until all of the models that reference it are deleted. There isn't a way to check directly in ollama which model is using a given blob, but you can run:

cd /usr/share/ollama/.ollama/models && grep -R "sha256:<id of the blob>" *

Hope that helps. I'm going to go ahead and close the issue.
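The grep suggestion above can also be expressed as a small script. This is a minimal sketch, assuming the default model-store layout in which manifest files under <models_dir>/manifests record each layer's digest as sha256:<id>; the layout and the helper name are assumptions for illustration, not part of ollama's documented interface.

```python
import os

def models_referencing_blob(models_dir: str, digest: str) -> list[str]:
    """Return manifest file paths that mention the given blob digest.

    Equivalent to: cd <models_dir> && grep -R "sha256:<digest>" manifests
    Assumes (unverified) that manifests live under <models_dir>/manifests
    and reference blobs by their sha256 digest.
    """
    needle = digest if digest.startswith("sha256:") else "sha256:" + digest
    hits = []
    manifests_root = os.path.join(models_dir, "manifests")
    for root, _dirs, files in os.walk(manifests_root):
        for name in files:
            path = os.path.join(root, name)
            try:
                with open(path, "r", encoding="utf-8", errors="ignore") as f:
                    if needle in f.read():
                        hits.append(path)
            except OSError:
                continue  # skip unreadable files rather than abort the scan
    return hits
```

Under these assumptions, a file in blobs/ whose digest yields no hits is no longer referenced by any manifest; a blob that does yield hits is still shared and, per the comment above, should only disappear once every model listed is removed.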

Reference: github-starred/ollama#1238