[GH-ISSUE #8891] weights disappears after ollama serve #5765

Closed
opened 2026-04-12 17:05:40 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @maifeeulasad on GitHub (Feb 6, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8891

What is the issue?

```
>ollama list # initially
NAME                                   ID              SIZE      MODIFIED       
deepseek-r1:14b                        ea35dfe18182    9.0 GB    10 seconds ago    
llama2:7b                              78e26419b446    3.8 GB    3 days ago        
llava:7b                               8dd30f6b0cb1    4.7 GB    2 weeks ago       
phi4:14b-q4_K_M                        ac896e5b8b34    9.1 GB    2 weeks ago       
deepseek-r1:8b                         28f8fd6cdc67    4.9 GB    2 weeks ago       
deepseek-r1:1.5b                       a42b25d8c10a    1.1 GB    2 weeks ago       
phi3:3.8b-mini-128k-instruct-q4_K_M    5a157c52afff    2.4 GB    2 weeks ago       

>ollama list # after ollama serve; the server crashes when I run this command
NAME                                   ID              SIZE      MODIFIED    
deepseek-r1:1.5b                       a42b25d8c10a    1.1 GB    7 days ago     
phi3:3.8b-mini-128k-instruct-q4_K_M    5a157c52afff    2.4 GB    2 weeks ago    

>ollama list # after crash
Error: could not connect to ollama app, is it running? 

>ollama list # after `sudo systemctl stop ollama`
Error: could not connect to ollama app, is it running?

>ollama list # after `sudo systemctl start ollama` or system restart
NAME                                   ID              SIZE      MODIFIED       
deepseek-r1:14b                        ea35dfe18182    9.0 GB    10 seconds ago    
llama2:7b                              78e26419b446    3.8 GB    3 days ago        
llava:7b                               8dd30f6b0cb1    4.7 GB    2 weeks ago       
phi4:14b-q4_K_M                        ac896e5b8b34    9.1 GB    2 weeks ago       
deepseek-r1:8b                         28f8fd6cdc67    4.9 GB    2 weeks ago       
deepseek-r1:1.5b                       a42b25d8c10a    1.1 GB    2 weeks ago       
phi3:3.8b-mini-128k-instruct-q4_K_M    5a157c52afff    2.4 GB    2 weeks ago
```

Relevant log output


OS

Linux

GPU

Nvidia

CPU

AMD

Ollama version

0.5.4

GiteaMirror added the bug label 2026-04-12 17:05:40 -05:00

@rick-github commented on GitHub (Feb 6, 2025):

`ollama serve` runs as your user and stores models in `~/.ollama`. `sudo systemctl start ollama` runs the server as the `ollama` user, which stores models in `/usr/share/ollama/.ollama`.
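A quick way to see which store each invocation would use (a sketch assuming the default Linux install paths; `OLLAMA_MODELS`, if set, overrides the user-server default):

```shell
# Model store used by a user-run `ollama serve`
echo "user serve : ${OLLAMA_MODELS:-$HOME/.ollama/models}"
# Model store used by the packaged systemd service (runs as the `ollama` user)
echo "systemd svc: /usr/share/ollama/.ollama/models"
```

Comparing the two paths (and listing their `manifests` subdirectories) makes it obvious that the "missing" models were simply pulled into the other store.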


@pdevine commented on GitHub (Feb 8, 2025):

As @rick-github mentioned, it looks like you're just running two different copies of the ollama server, which are getting the models from two different places. I'll go ahead and close the issue.


@maifeeulasad commented on GitHub (Feb 8, 2025):

Understood. Is there a way to share these weights between the two locations so I don't have to download them again?


@rick-github commented on GitHub (Feb 8, 2025):

`OLLAMA_MODELS=/usr/share/ollama/.ollama/models ollama serve`

Or create a symlink, or use a bind mount.
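The three approaches above could look like this (a sketch, not meant to be run verbatim: paths assume the default Linux install, and the user-run server needs read/write access to the system store, which is owned by the `ollama` user):

```shell
# 1. One-off: point the user server at the system store via the env var
OLLAMA_MODELS=/usr/share/ollama/.ollama/models ollama serve

# 2. Symlink: replace the user store with a link to the system one
mv ~/.ollama/models ~/.ollama/models.bak            # keep any user-pulled models
ln -s /usr/share/ollama/.ollama/models ~/.ollama/models

# 3. Bind mount: same effect at the filesystem level (root required;
#    add an entry to /etc/fstab to make it persist across reboots)
sudo mount --bind /usr/share/ollama/.ollama/models ~/.ollama/models
```

With any of these, both the user server and the systemd service read the same weights, so each model is downloaded only once.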

Reference: github-starred/ollama#5765