[GH-ISSUE #6668] Every installed model disappeared #50706

Closed
opened 2026-04-28 16:48:53 -05:00 by GiteaMirror · 22 comments

Originally created by @yilmaz08 on GitHub (Sep 6, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6668

What is the issue?

After turning on my PC today, I realized I was unable to use any Ollama models. The ollama daemon is running, but `ollama ls` doesn't show anything. I tried reinstalling llama3.1:8b and it works.

Somehow every installed model disappeared and I need to reinstall all of them. (It's not a huge problem for me, but I wanted to report it.)

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.3.9

GiteaMirror added the bug label 2026-04-28 16:48:53 -05:00

@kyechou commented on GitHub (Sep 6, 2024):

I had the same issue today. You might want to consider the following.

> Disclaimer: Do not run commands from strangers online without understanding them first.

```sh
cd /var/lib/ollama # or /usr/share/ollama, depending on your system
# Back up the top-level directories that ollama now reads from
mv blobs blobs.backup
mv manifests manifests.backup
# Move the existing model data up from the old location
mv .ollama/models/blobs .ollama/models/manifests ./
rmdir .ollama/models # remove the now-empty old directory
ollama ls
```
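
Before moving anything, it may help to confirm which layout your install actually has. A quick pre-check sketch; the paths are assumptions and vary by distro and install method:

```sh
# Hypothetical pre-check: see where the model data currently lives.
# Adjust the base directory (/var/lib/ollama vs /usr/share/ollama) for your system.
ls -ld /var/lib/ollama/blobs /var/lib/ollama/manifests 2>/dev/null
ls -ld /var/lib/ollama/.ollama/models/blobs /var/lib/ollama/.ollama/models/manifests 2>/dev/null
```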

@rick-github commented on GitHub (Sep 6, 2024):

Which distro? What install method did you use?


@yilmaz08 commented on GitHub (Sep 7, 2024):

> I had the same issue today. You might want to consider the following.
>
> > Disclaimer: Do not run commands from strangers online without understanding them first.
>
> ```shell
> cd /var/lib/ollama # or /usr/share/ollama, depending on your system
> mv blobs blobs.backup
> mv manifests manifests.backup
> mv .ollama/models/blobs .ollama/models/manifests ./
> rmdir .ollama/models
> ollama ls
> ```

It worked, thank you so much!

> Which distro? What install method did you use?

On Arch Linux, and the package is [extra/ollama-cuda](https://archlinux.org/packages/extra/x86_64/ollama-cuda/) from the official repositories.


@rick-github commented on GitHub (Sep 7, 2024):

Another Arch user reported the same issue (https://github.com/ollama/ollama/issues/6639), so this should be [reported](https://gitlab.archlinux.org/archlinux/packaging/packages/ollama/-/issues) to the Arch repo.


@olokelo commented on GitHub (Sep 10, 2024):

Yes, I use Arch btw, but I don't think it should be reported. It was a decision of the package maintainer, and it even gives you a warning when upgrading, so I wouldn't consider that a bug.


@rick-github commented on GitHub (Sep 10, 2024):

Fair enough; if it's deliberate and the package gives a warning, then it's on the user to manage the problem.


@onexzero commented on GitHub (Feb 19, 2025):

I have the same issue. Ollama version is 0.5.7-0-ga420a45-dirty.

```
Linux ca3befed29d5 5.15.153.1-microsoft-standard-WSL2 #1 SMP Fri Mar 29 23:14:13 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
```


@aomegaai commented on GitHub (May 5, 2025):

Same here! Super annoying.


@gkzsolt commented on GitHub (Sep 13, 2025):

Same here. What I did was enable the ollama systemd service and restart it, and then the pulled models disappeared!?


@rick-github commented on GitHub (Sep 13, 2025):

If you were previously running ollama as `ollama serve`, the models are in `~/.ollama/models`. If you are now running ollama via systemd, the models are in `~ollama/.ollama/models`.

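A quick way to confirm which store actually holds the models. This is a sketch assuming the default storage layout (manifests under `models/manifests/registry.ollama.ai/library`) and the standard `ollama` service user:

```sh
# Models pulled while running `ollama serve` as yourself:
ls ~/.ollama/models/manifests/registry.ollama.ai/library 2>/dev/null
# Models pulled while the systemd service (user 'ollama') was serving:
sudo ls ~ollama/.ollama/models/manifests/registry.ollama.ai/library 2>/dev/null
```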

@gkzsolt commented on GitHub (Sep 13, 2025):

Thanks, Rick. I am a bit confused about the different modes of running. First, all the documentation talks about `ollama pull <model>`, `ollama list`, `ollama run <model>`, which is a CLI. Then there is `ollama serve`, and then the systemd ollama service. Currently I am running `ollama serve` to be able to access it remotely, but the systemd service is also active.
What mode am I in when I call `ollama run`?
How can I run it via systemd?


@rick-github commented on GitHub (Sep 13, 2025):

Ollama is a client/server architecture, with one binary serving both roles. Run as `ollama serve`, the program opens a listener on port 11434 and waits for commands. When run as `ollama pull`, `ollama run`, `ollama list`, etc., it sends a request to the ollama server to accomplish the requested command. For example, `ollama run mymodel` sends a command to the server to run the requested model. If the model is not in local storage, the server will download it. The server then starts a subprocess called the `runner`, whose job is to load the model from disk into the GPU and start the inference. The server monitors the output from the runner and sends it back to the client, i.e. the `ollama run` command, which then outputs the content to your terminal.

If you start the server manually (`ollama serve`), the server runs as you, so any state (models, keys) it generates is stored in your home directory in `.ollama`, i.e. `~/.ollama`. When the server is started by the systemd service, it runs as the `ollama` user, so its state is stored in the home directory of the `ollama` user, in `~ollama/.ollama`.

If the systemd service is active, you don't need to (and shouldn't) run ollama manually as `ollama serve`. If you want to access the ollama server remotely, you need to run `systemctl edit ollama` and set `OLLAMA_HOST=0.0.0.0` as an `Environment` variable.

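To see which server the client is actually reaching, one option is to query the version endpoint directly (a sketch assuming the default port 11434; `/api/version` is part of the Ollama HTTP API):

```sh
# Ask the server on the default local address for its version.
curl -s http://127.0.0.1:11434/api/version
# The CLI honors OLLAMA_HOST as well, so you can point it at a specific server:
OLLAMA_HOST=127.0.0.1:11434 ollama list
```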

@gkzsolt commented on GitHub (Sep 13, 2025):

> If the systemd service is active, you don't need to (and shouldn't) run ollama manually as `ollama serve`.

If I do not run `ollama serve`, the command-line `ollama` cannot connect to the server:

```
$ ollama list
Error: ollama server not responding - could not connect to ollama server, run 'ollama serve' to start it

$ systemctl status ollama.service
● ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; preset: enabled)
    Drop-In: /etc/systemd/system/ollama.service.d
             └─override.conf
     Active: active (running) since Sat 2025-09-13 15:36:48 CEST; 5h 16min ago
   Main PID: 53246 (ollama)
      Tasks: 13 (limit: 76870)
     Memory: 25.1M (peak: 43.1M)
        CPU: 506ms
     CGroup: /system.slice/ollama.service
             └─53246 /usr/local/bin/ollama serve
```

How can I make use of the systemd ollama service?


@rick-github commented on GitHub (Sep 13, 2025):

```
systemctl cat --no-pager ollama
```

@gkzsolt commented on GitHub (Sep 13, 2025):

```
$ systemctl cat --no-pager ollama
# /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/bin"

[Install]
WantedBy=default.target

# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=192.168.1.4"
```

@rick-github commented on GitHub (Sep 13, 2025):

Environment="OLLAMA_HOST=0.0.0.0"
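For context: binding to `0.0.0.0` makes the server listen on all interfaces, so the local CLI (which defaults to `127.0.0.1:11434`) can reach it as well as remote hosts; with `OLLAMA_HOST=192.168.1.4` it was only listening on that one address. A minimal sketch of applying the change, using the standard systemd workflow:

```sh
# Reload unit files and restart the service so the override takes effect.
sudo systemctl daemon-reload
sudo systemctl restart ollama
ollama list   # the client's default of 127.0.0.1:11434 should now answer
```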

@gkzsolt commented on GitHub (Sep 13, 2025):

Thanks. After setting `Environment="OLLAMA_HOST=0.0.0.0"`, `ollama list` works and connects to the server run by the systemd service. In the meantime I pulled some LLM models, and they are in my home dir `~/.ollama`. Can I move them to `~ollama/.ollama` (and do the owner updates)?


@rick-github commented on GitHub (Sep 13, 2025):

> Can I move them to `~ollama/.ollama` (and do the owner updates)?

Yes.

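A minimal sketch of that move, assuming the standard install (systemd service running as the `ollama` user, default storage paths). Stopping the service first avoids moving files while the server is using them:

```sh
# Stop the server, merge the user-pulled models into the service user's store,
# fix ownership, and restart.
sudo systemctl stop ollama
sudo cp -a ~/.ollama/models/. ~ollama/.ollama/models/
sudo chown -R ollama:ollama ~ollama/.ollama/models
sudo systemctl start ollama
ollama list   # the moved models should now be listed
# Once verified, the originals under ~/.ollama/models can be removed.
```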

@gkzsolt commented on GitHub (Sep 13, 2025):

Thanks a lot! You made my day :-)


@AlexanderPershin commented on GitHub (Mar 6, 2026):

I've just tried to run the latest `gpt-oss` model and it showed some Python error. I tried reinstalling ollama; after the new installation all the models disappeared… hope you'll fix this.


@rick-github commented on GitHub (Mar 6, 2026):

> it showed some Python error

Ollama is not written in Python, so this sounds like a client issue.

> after new installation all the models disappeared… hope you'll fix this

If you didn't manually delete the models, they are still there. What OS are you running?


@AlexanderPershin commented on GitHub (Mar 6, 2026):

It turned out there was an ollama service running in the background, even though ollama itself had been deleted. This somehow led to strange behavior. After getting rid of the service and reinstalling ollama and the missing model, the other models suddenly appeared in the output of `ollama list`.

Reference: github-starred/ollama#50706