[GH-ISSUE #1296] All models gone? #669

Closed
opened 2026-04-12 10:21:23 -05:00 by GiteaMirror · 2 comments

Originally created by @iplayfast on GitHub (Nov 28, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1296

I have no idea what happened. It had been working. Then I ran:

ollama run alfred
Error: could not connect to ollama server, run 'ollama serve' to start it

(alfred was previously installed.)

So I ran:

ollama serve &
ollama run alfred

and it started downloading the model again!

ollama list
shows that all the models are gone. In /usr/share/ollama/.ollama/models/blobs there are a lot of files, some of them large, so I think the models are still there — but ollama doesn't know about them.


@iplayfast commented on GitHub (Nov 28, 2023):

I exited all bash prompts, ran ollama via systemctl ("systemctl ollama serve"), and all the models are back. Very weird.


@technovangelist commented on GitHub (Nov 28, 2023):

The FAQ under docs in the repo explains how we store models. Also, based on your description, you were running Ollama as two different users: the systemctl command runs ollama as the user ollama, while running ollama serve yourself runs it as your own user. An ollama run issued as your user knows nothing about the models downloaded by the user ollama. So based on your description, it seems to be working as expected. I'll close this issue. Thanks so much for being a great Ollama user.
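To make the two-store behavior concrete, here is a small shell sketch. It assumes the default Linux install layout (the /usr/share/ollama path is the one the reporter found; ~/.ollama is the default for a server run as a regular user, and the OLLAMA_MODELS environment variable can override either):

```shell
#!/bin/sh
# Two separate model stores, one per serving user (default Linux layout).
# The systemd service runs as the user "ollama"; an interactive
# `ollama serve` runs as you, so each instance looks in its own directory.

system_store=/usr/share/ollama/.ollama/models   # systemd service (user "ollama")
user_store="$HOME/.ollama/models"               # your own `ollama serve`

echo "service store: $system_store"
echo "user store:    $user_store"

# List what each instance would see (either directory may be missing):
ls "$system_store/manifests" 2>/dev/null || true
ls "$user_store/manifests" 2>/dev/null || true

# To use the models the service downloaded, talk to the service instead
# of starting a second server as yourself:
#   sudo systemctl start ollama
#   ollama list
```

Because the stores are separate, an `ollama run` against a server started as your own user re-downloads models the service already has — which matches what the reporter saw.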

Reference: github-starred/ollama#669