[GH-ISSUE #7725] How to check the actual location where the model file is saved, and the directory queried by 'ollama list' #4931

Closed
opened 2026-04-12 15:59:33 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @supersaiyan2019 on GitHub (Nov 18, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7725

What is the issue?

I encountered an error while using the new model minicpm-v (#6751), and the issue persists...

Since installing minicpm-v, my ollama server version has stayed at 0.3.6. My problem from #6751 has never been solved. I have completely deleted ollama, restarted Windows, and reinstalled ollama. As soon as minicpm-v appears in the list again, my version becomes 0.3.6, the OLLAMA_MODELS setting becomes invalid, and I don't know where the new model was downloaded.

Here's what I painfully tried...

```
ollama -v
ollama version is 0.3.6
Warning: client version is 0.4.2
```

In #6751 my client version was 0.3.10 and it is now 0.4.2, but the server version is always 0.3.6...
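A client/server version mismatch like this usually means the CLI is talking to a server process from a different install (for example a stale service, or a copy running inside WSL). As a hedged sketch, the two numbers can be pulled out of the `ollama -v` output and compared; the `parse_versions` helper here is illustrative, not part of ollama:

```shell
#!/bin/sh
# Sketch: extract the server and client versions from `ollama -v` output
# and flag a mismatch, which hints that the CLI reached a different server.
parse_versions() {
  # $1 is the raw output of `ollama -v`
  server=$(printf '%s\n' "$1" | sed -n 's/^ollama version is \([0-9.]*\).*/\1/p')
  client=$(printf '%s\n' "$1" | sed -n 's/^Warning: client version is \([0-9.]*\).*/\1/p')
  if [ -n "$client" ] && [ "$client" != "$server" ]; then
    echo "mismatch: server=$server client=$client"
  else
    echo "ok: $server"
  fi
}

# Example with the output quoted in this issue:
out='ollama version is 0.3.6
Warning: client version is 0.4.2'
parse_versions "$out"
```

When the versions differ, the next step is to find which process actually owns port 11434, because that is the server the client reached.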

```
ollama rm minicpm-v:latest
Error: unable to stop existing running model "minicpm-v:latest": llama runner process has terminated: GGML_ASSERT(new_clip->has_llava_projector) failed
```

I want to remove it, but there is no way.

```
ollama run minicpm-v:latest
Error: llama runner process has terminated: GGML_ASSERT(new_clip->has_llava_projector) failed
```

Obviously `run` won't work either.

![image](https://github.com/user-attachments/assets/c6590784-6c88-4a38-bf44-87da889dbf1e)
The directory set by OLLAMA_MODELS has no minicpm-v files.

![image](https://github.com/user-attachments/assets/1ad8f255-fb7d-4a18-a37c-d2f1535b162f)
The default user/.ollama directory has no models directory.

![image](https://github.com/user-attachments/assets/1dfefa0c-c7fd-47b5-8512-50eebfc61eb6)
Yet `ollama list` still shows the models, even though they cannot be deleted or run.

What I want most now is to delete these three models so they are no longer listed. I can list them from the command line on this machine, but when calling from clients, even `list` reports an error. Please tell me how to find where they are stored and delete them permanently, or how to clean up the whole environment when uninstalling ollama, so I can get rid of this pain.
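For reference, ollama stores each model as a small manifest plus content-addressed blobs under its models directory (`OLLAMA_MODELS` if set, otherwise `~/.ollama/models`, i.e. `C:\Users\<you>\.ollama\models` on Windows). Below is a hedged sketch of where a model's manifest should live, assuming the usual `manifests/registry.ollama.ai/library/<name>/<tag>` layout; the `manifest_path` helper is illustrative, not an ollama command:

```shell
#!/bin/sh
# Sketch: compute the expected manifest location for a model reference,
# given ollama's on-disk layout of <models>/manifests/<registry>/<ns>/<name>/<tag>.
manifest_path() {
  models_dir=$1   # e.g. "$OLLAMA_MODELS" or "$HOME/.ollama/models"
  ref=$2          # e.g. "minicpm-v:latest"
  name=${ref%%:*}
  tag=${ref#*:}
  [ "$tag" = "$ref" ] && tag=latest   # no explicit tag -> "latest"
  printf '%s/manifests/registry.ollama.ai/library/%s/%s\n' "$models_dir" "$name" "$tag"
}

manifest_path "$HOME/.ollama/models" "minicpm-v:latest"
```

Deleting that manifest file (with the server stopped) removes the entry from `ollama list`; the blobs or the whole models directory can then be removed when uninstalling. Note that if two servers exist (one on Windows, one in WSL), each has its own models directory, which would explain models that `list` shows but that are nowhere under the directory you are looking in.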

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.4.2

GiteaMirror added the wslbug label 2026-04-12 15:59:33 -05:00
Author
Owner

@rick-github commented on GitHub (Nov 18, 2024):

You have two ollama servers running. Have you previously installed ollama in WSL (https://github.com/ollama/ollama/issues/6701)?
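One way to test that theory, as a hedged sketch (run inside the WSL distro; the `port_11434_bound` helper and the sample socket line are illustrative):

```shell
#!/bin/sh
# Sketch: check whether a second ollama install inside WSL could be
# shadowing the Windows server on the default port 11434.

# 1. Is an ollama binary installed in this WSL distro at all?
if command -v ollama >/dev/null 2>&1; then
  echo "ollama found in WSL at: $(command -v ollama)"
fi

# 2. Helper: does a socket listing (e.g. the output of `ss -ltn`)
#    show something bound to the default ollama port?
port_11434_bound() {
  printf '%s\n' "$1" | grep -q ':11434'
}

# Example against a captured `ss -ltn` line:
sample='LISTEN 0 4096 127.0.0.1:11434 0.0.0.0:*'
if port_11434_bound "$sample"; then
  echo "a server is bound to 11434"
fi
```

If WSL turns out to have its own ollama, stopping or removing it (or the Windows one) should make the client and server versions agree again.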

Author
Owner

@supersaiyan2019 commented on GitHub (Nov 18, 2024):

thx @rick-github

Reference: github-starred/ollama#4931