[GH-ISSUE #9889] OLLAMA_MODELS directive not respected (Windows) #68532

Closed
opened 2026-05-04 14:19:14 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @thomasw-mitutoyo-ctl on GitHub (Mar 19, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9889

What is the issue?

I have moved my Ollama models from my system disk (user home directory) to a different internal drive and modified the %OLLAMA_MODELS% environment variable accordingly.

I have tried:

  • setting %OLLAMA_MODELS% as a user environment variable
  • setting %OLLAMA_MODELS% as a system environment variable
  • quitting Ollama from the tray icon and starting it again
  • updating Ollama to the latest version

This issue is similar to #9877 but I'm on Windows, so the commands and procedures are a bit different.

Relevant log output

Here's the output of a command line session which shows that the environment variable is set and points to a valid directory:


C:\>set OLLAMA
OLLAMA_MODELS=E:\OllamaModels

C:\>cd /d %OLLAMA_MODELS%

E:\OllamaModels>dir
 Volume in drive E is temp
 Volume Serial Number is 4A6A-C2DF

 Directory of E:\OllamaModels

2025-03-19  11:38    <DIR>          .
2025-03-19  11:38    <DIR>          ..
2025-03-17  13:52    <DIR>          blobs
2025-03-05  09:35             1.865 history
2025-01-27  07:49               387 id_ed25519
2025-01-27  07:49                81 id_ed25519.pub
2025-03-19  11:38    <DIR>          manifests
2025-01-27  13:38    <DIR>          models
               3 File(s)          2.333 bytes
               5 Dir(s)  345.896.026.112 bytes free

E:\OllamaModels>ollama list
NAME    ID    SIZE    MODIFIED


The debug log contains personal and/or company internal information, so I will not attach the full log.

I see that the environment variable is picked up:

2025/03/19 11:46:41 routes.go:1230: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:2048 OLLAMA_DEBUG:true OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:E:\\OllamaModels OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES:]"

OS

Windows

GPU

No response

CPU

Intel

Ollama version

No response

GiteaMirror added the bug label 2026-05-04 14:19:14 -05:00
Author
Owner

@thomasw-mitutoyo-ctl commented on GitHub (Mar 19, 2025):

The models directory is incorrect. From the output of dir, your directory should be E:\OllamaModels\models.
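For anyone hitting the same symptom, a quick sanity check is to verify that the path in OLLAMA_MODELS directly contains the blobs and manifests directories that hold the model data — in the listing above, those live under E:\OllamaModels\models, not E:\OllamaModels itself. The helper below is a hypothetical sketch (not part of Ollama) based on that layout:

```python
import os
from pathlib import Path


def missing_store_dirs(models_dir: str) -> list[str]:
    """Return the expected store subdirectories that are missing under models_dir.

    Ollama's model store keeps its data in blobs/ and manifests/ directly
    under the configured path; if either is missing here, OLLAMA_MODELS
    probably points one level too high or too low.
    """
    root = Path(models_dir)
    return [sub for sub in ("blobs", "manifests") if not (root / sub).is_dir()]


if __name__ == "__main__":
    path = os.environ.get("OLLAMA_MODELS", "")
    if not path:
        print("OLLAMA_MODELS is not set in this process's environment")
    else:
        missing = missing_store_dirs(path)
        if missing:
            print(f"{path} is missing: {', '.join(missing)} - "
                  "point OLLAMA_MODELS at the directory that contains them")
        else:
            print(f"{path} looks like a model store directory")
```

Note that the check only inspects the directory layout; it cannot tell a stale copy of the store from the live one.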

Author
Owner

@DardanIsufi95 commented on GitHub (Mar 1, 2026):

I ran into the same issue on version 0.17.4 (Windows 10.0.19045). Setting the OLLAMA_MODELS environment variable (both as a system variable and as a user variable) did not work for me.

What finally solved it was manually updating the configuration directly in the SQLite database:

%USERPROFILE%\AppData\Local\Ollama\db.sqlite

I modified the relevant entry in the settings table to set the correct models path there. After updating the value and restarting Ollama, everything worked as expected.

Posting this in case anyone else is stuck with OLLAMA_MODELS not being respected on Windows.
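If anyone follows this database route, a minimal sketch of the edit using Python's built-in sqlite3 module might look like the following. The schema here is an assumption reconstructed from this comment (a settings table with key/value columns and a models_path key) — it is not a documented Ollama schema, so inspect the actual table first (e.g. .schema settings in the sqlite3 shell) and back up db.sqlite before changing anything:

```python
import sqlite3

# Hypothetical key name -- check the real rows first with:
#   SELECT * FROM settings;
MODELS_KEY = "models_path"


def set_models_path(db_path: str, new_path: str) -> int:
    """Point the assumed models-path setting at new_path.

    Returns the number of rows updated; 0 means the assumed key name
    (or table) does not match the actual schema.
    """
    con = sqlite3.connect(db_path)
    try:
        cur = con.execute(
            "UPDATE settings SET value = ? WHERE key = ?",
            (new_path, MODELS_KEY),
        )
        con.commit()
        return cur.rowcount
    finally:
        con.close()
```

Stop Ollama before editing the database, since a running instance may hold the file open or overwrite the change on exit.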

Author
Owner

@Mikt25 commented on GitHub (Mar 14, 2026):

I know this is old, but it might help someone who made the same mistake I did. After changing your path to E:\OllamaModels\models, make sure you also completely move or delete the old models directory; otherwise the environment variable won't be respected.
I'm on version 0.18.


Reference: github-starred/ollama#68532