[GH-ISSUE #12537] ollama pull <model> ignores folder settings, environment variables, etc. #70378

Closed
opened 2026-05-04 21:19:23 -05:00 by GiteaMirror · 5 comments

Originally created by @SkybuckFlying on GitHub (Oct 8, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12537

What is the issue?

It simply downloads to C: instead of D: or H:, as told to by environment variables or GUI settings.

ollama serve uses the environment variable settings, while the Ollama GUI uses different settings.

Perhaps streamline the whole thing... all these pathing/settings issues are kind of annoying...

(Also, being able to set multiple drives for models would be nice...
currently I use two ollama_models environment variables and switch between them: _models1 vs _models2 vs _models, etc.)
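Roughly what that switching looks like (the paths and the extra variable names are just examples, not anything Ollama defines; only OLLAMA_MODELS itself is read by the server):

```powershell
# hypothetical helper variables holding the two candidate locations
setx OLLAMA_MODELS_1 "D:\ollama\models"
setx OLLAMA_MODELS_2 "H:\ollama\models"

# point the variable Ollama actually reads at one of them,
# then restart the Ollama app so the server picks it up
setx OLLAMA_MODELS "D:\ollama\models"
```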

also kind of annoying how ollama list stays resident... I just need it to show model names for ollama serve and for loading models into ollama run, which is necessary for using opencode and any other tools that talk to Ollama over HTTP. Somebody seemed to believe this was not necessary; not sure what he is on about. Can opencode work with Ollama without "serve"/HTTP? Not sure...

I need to shut down Ollama via the system tray on Windows 11 before ollama serve can be run, otherwise it fails with a TCP "address already in use" error... wastes some time going through all these shenanigans.

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the question, needs more info labels 2026-05-04 21:19:23 -05:00

@rick-github commented on GitHub (Oct 8, 2025):

Environment variables affect the server. The client, i.e. `ollama pull`, just sends requests to the server. The server then uses the environment variable settings to decide where to download models.

`ollama list` is another invocation of the client. On Windows, if the server is not running, it will be started by the client. So the resident ollama process is the server, not `ollama list`.

Don't run `ollama serve` manually. The Ollama architecture is server/client. The server is started when the Ollama app is run, usually when the user logs in. The server continues to run, accepting commands from the ollama client (`ollama list`, `ollama pull`, `ollama create`, etc.). There is no need to stop the server and then start it manually.
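For example (a hypothetical Windows session; the model name and path are just placeholders), setting the variable only in the client's shell changes nothing, because the client forwards the request to the already-running server:

```powershell
# affects only this PowerShell session, not the resident server
$env:OLLAMA_MODELS = "D:\ollama\models"

# the pull is handled by the server, which still uses its own OLLAMA_MODELS
ollama pull llama3.2
```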


@SkybuckFlying commented on GitHub (Oct 8, 2025):

*** UPDATE ***:

It does seem to download to D:, or whatever I set it to.

However, the console shows C:\

So some strange programming error.


@rick-github commented on GitHub (Oct 8, 2025):

You have to set the variables in the server environment and then restart the server for it to load the changed configuration.
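A minimal sketch of that on Windows, assuming the default app setup (the path is an example; see the FAQ link below for the supported way to set variables):

```powershell
# set the variable user-wide so the server launched by the Ollama app sees it
setx OLLAMA_MODELS "D:\ollama\models"

# then quit Ollama from the system tray and start the app again;
# the restarted server will pick up the new models location
```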

@rick-github commented on GitHub (Oct 8, 2025):

https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-windows

@pdevine commented on GitHub (Oct 9, 2025):

@SkybuckFlying let us know if you have figured it out. You just need to set the `OLLAMA_MODELS` environment variable correctly, following the docs that @rick-github sent, or you can set it in the app instead.
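A quick way to verify (illustrative; the path should match whatever `OLLAMA_MODELS` was set to):

```powershell
# pull any model, then confirm the blobs land on the intended drive
ollama pull llama3.2
Get-ChildItem "D:\ollama\models\blobs"
```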
