[GH-ISSUE #1478] Models are sometimes lost #798

Closed
opened 2026-04-12 10:28:36 -05:00 by GiteaMirror · 4 comments

Originally created by @iplayfast on GitHub (Dec 12, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1478

This has happened before, and I don't know what causes it. (I'm using autogen and litellm.)
Ollama stalled with my application and ended up with a timeout error.

I realized there were some defunct processes, so I ran `systemctl stop ollama` and killed every instance of ollama I could find.
Upon starting it again, it didn't have the models.

The last time it happened, I believe a reboot brought everything back.

GiteaMirror added the bug label 2026-04-12 10:28:36 -05:00

@BruceMacD commented on GitHub (Dec 12, 2023):

This is unexpected, thanks for reporting. It looks like you are on Linux. When Ollama fails with the timeout and you restart it, are you starting `ollama serve` or the systemctl service? Ollama runs in different contexts in those two scenarios (as the current user when using `ollama serve`, and as an `ollama` user when run by systemctl), so it may be looking at different model directories in those cases. Next time it happens, take note of whether you're running ollama the same way when it is restarted.

As an aside, I'll try to reproduce the timeout.
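The two-context behavior described above can be checked directly. This is a minimal sketch; the paths are the defaults used by Ollama's Linux install script (the systemd service runs as the dedicated `ollama` user, a manual `ollama serve` runs as you) and may differ on your system:

```shell
# Check both default model directories to see where your models actually live.
# - systemd service (runs as the `ollama` user): /usr/share/ollama/.ollama/models
# - manual `ollama serve` (runs as you):         ~/.ollama/models
for dir in /usr/share/ollama/.ollama/models "$HOME/.ollama/models"; do
    if [ -d "$dir" ]; then
        echo "models directory exists: $dir"
    else
        echo "no models directory at:  $dir"
    fi
done
```

Whichever directory is populated is the one the "working" context was using; the empty one belongs to the other context.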


@iplayfast commented on GitHub (Dec 13, 2023):

I am on Linux. I believe this time I used systemctl.


@pdevine commented on GitHub (Mar 12, 2024):

If you switched between `ollama serve` and `systemctl`, your models will definitely be in two separate places. You can use the `OLLAMA_MODELS` env variable to point to the correct place and your models should show up again.

I'm going to go ahead and close the issue, but feel free to keep commenting or reopen it if you're still having problems.
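The fix above can be sketched as follows. Assumptions: the directory shown is the default used by Ollama's Linux install script for the systemd service; adjust the path (and file permissions) for your setup:

```shell
# Hypothetical sketch: make a user-run `ollama serve` read the same model
# directory that the systemd service uses, so the "missing" models reappear.
export OLLAMA_MODELS=/usr/share/ollama/.ollama/models
echo "OLLAMA_MODELS is set to: $OLLAMA_MODELS"
# Restart the server in this shell; `ollama list` should then show the models.
```

Going the other direction (making the systemd service see your user's models) works the same way, via an `Environment="OLLAMA_MODELS=..."` line in a systemd drop-in override followed by `systemctl daemon-reload` and a service restart.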


@WindSmileValley commented on GitHub (May 8, 2024):

Models are sometimes lost on Windows as well.


Reference: github-starred/ollama#798