Deleting a model isn't removing its blob #219

Closed
opened 2025-11-11 14:11:40 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @racso-dev on GitHub (Jan 22, 2024).

Bug Report

Description

Bug Summary:
When I try to delete a model through the Settings UI, the deletion doesn't seem to work properly.

Steps to Reproduce:
Settings > Select a model to delete > Delete

Expected Behavior:
It should delete the model, and /usr/share/ollama/.ollama/models/blobs should therefore no longer contain the model's blob.

Actual Behavior:
The model's blob isn't removed from /usr/share/ollama/.ollama/models/blobs, so the disk space isn't freed.
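One way to confirm the leftover blob is to check the blob store's disk usage on the Ollama host before and after deleting the model. This sketch assumes the default path for a native Linux install of Ollama; adjust it for a Docker volume.

```shell
# Default blob store for a native Linux Ollama install (assumption;
# Docker installs typically mount it elsewhere, e.g. /root/.ollama).
BLOB_DIR="/usr/share/ollama/.ollama/models/blobs"

# Total disk usage of the blob store (run before and after deletion):
du -sh "$BLOB_DIR"

# List blobs with sizes and timestamps to spot the deleted model's blob:
ls -lh "$BLOB_DIR"
```

If the totals match before and after, the delete only removed the model's manifest, not its layers.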

Environment

  • Operating System: Ubuntu 22.04
  • Browser (if applicable): Chrome Version 120.0.6099.224 (Official Build) (64-bit)

Reproduction Details

Confirmation:

  • [Y] I have read and followed all the instructions provided in the README.md.
  • [Y] I have reviewed the troubleshooting.md document.
  • [N] I have included the browser console logs. (Not relevant, but maybe I'm wrong)
  • [N] I have included the Docker container logs. (Not relevant, but maybe I'm wrong)

Installation Method

I installed the project by building a Docker container. The Ollama inference server is deployed on a remote machine, and I included its URL in the Docker container's environment.


@justinh-rahb commented on GitHub (Jan 22, 2024):

This should be filed with the upstream project, Ollama. Nothing we can do about it here. Ollama does cleanup orphan blobs when it starts.


Reference: github-starred/open-webui#219