[GH-ISSUE #11312] Does anyone know why I need to download the model again after using the startup service? #7463

Closed
opened 2026-04-12 19:32:08 -05:00 by GiteaMirror · 4 comments

Originally created by @mkinit on GitHub (Jul 6, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11312

Why does Ollama download the model to the C drive again after I start the service? I have already changed the model path to the F drive in the settings.

If I don't start the service, the model is not downloaded again.


@rick-github commented on GitHub (Jul 6, 2025):

The change to `OLLAMA_MODELS` has to be in the environment of the server. If you could provide details of your configuration, it would be easier to answer the question.
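
One way to make the setting visible to any server process on Windows is to define `OLLAMA_MODELS` as a user environment variable; a minimal sketch in a terminal (the `F:\ollama\models` path below is only an example, not the reporter's actual directory):

```powershell
# Persist OLLAMA_MODELS as a user environment variable so that
# processes started later can see it.
# F:\ollama\models is only an example path — use the directory you chose.
setx OLLAMA_MODELS "F:\ollama\models"

# setx does not affect already-running processes or the current shell,
# so quit the Ollama tray app (and/or close this terminal) and start it again.
```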


@mkinit commented on GitHub (Jul 6, 2025):

> The change to `OLLAMA_MODELS` has to be in the environment of the server. If you could provide details of your configuration, it would be easier to answer the question.

I'm using Windows 10 and installed Ollama with the .exe installer. I changed the model location to the F drive in the graphical interface settings.
Because I want to use Chatbox to connect to the models downloaded by Ollama, I tried to start the service with `ollama serve`.


@rick-github commented on GitHub (Jul 6, 2025):

After quitting ollama from the systray, try starting it from the toolbar: type ollama in the search box and click on the Ollama app. Running `ollama serve` in a terminal window may not inherit the variables from the sqlite configuration file.
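
If the server does have to be started from a terminal (for example, to use it with Chatbox), a minimal sketch, assuming PowerShell and the same example path of `F:\ollama\models`, is to set the variable in that session before launching:

```powershell
# Set the variable for this PowerShell session only, then start the server.
# F:\ollama\models is an assumed example path.
$env:OLLAMA_MODELS = "F:\ollama\models"
ollama serve
```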


@mkinit commented on GitHub (Jul 6, 2025):

> After quitting ollama from the systray, try starting it from the toolbar: type ollama in the search box and click on the Ollama app. Running `ollama serve` in a terminal window may not inherit the variables from the sqlite configuration file.

I now understand that running `ollama serve` and starting from the GUI use different configurations. Thank you.
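
A quick way to check which configuration a terminal-started server would use (PowerShell; the default location mentioned in the comment is an assumption based on a standard Windows install):

```powershell
# Print what this session sees; if it is empty, a server started from
# this terminal likely falls back to the default models directory under
# the user profile on C: (typically C:\Users\<user>\.ollama\models).
echo $env:OLLAMA_MODELS
```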

Reference: github-starred/ollama#7463