[GH-ISSUE #6634] model reconfigure instructions: bug #2147 does not work #4175

Closed
opened 2026-04-12 15:06:14 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @vap0rtranz on GitHub (Sep 4, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6634

What is the issue?

Reconfiguring the model directory to an external drive makes the service fail with a permissions error. The workarounds in bug #2147 (https://github.com/ollama/ollama/issues/2147) do not fix this.

Fixes I tried:

  • chown models dir
  • chown ../models dir
  • chown external drive mountpoint
  • chmod 750 models dir
  • chmod g+w ../models dir
  • chmod o+w external drive mountpoint

Changing ownership/permissions on the parent of the models dir doesn't fix this issue, nor does changing them on the models directory itself or on the mountpoint of the external drive.
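The failing step in the log below is `mkdir /media/justin/external`, which suggests the ollama user cannot even traverse `/media/justin` itself, so permission changes deeper in the tree never come into play. A quick way to find the offending ancestor is to walk the path and test search permission at each level (a sketch, not part of the original report; `check_path` is a hypothetical helper):

```shell
# Walk each ancestor of a path and report any component the current
# user cannot search (missing or lacking execute permission), which is
# what makes mkdir fail with "permission denied" higher up the tree.
check_path() {
  dir=$1
  while [ "$dir" != "/" ] && [ -n "$dir" ]; do
    [ -x "$dir" ] || echo "missing or unsearchable: $dir"
    dir=$(dirname "$dir")
  done
}

check_path /media/justin/external/CodeReady/LLM-Models/ollama-models
```

Run it as the service user (e.g. `sudo -u ollama sh`), since root bypasses permission checks and would report nothing.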

This is mind-boggling given how Linux services/daemons have worked for ... well, a dang long time.

Why was the service designed to demand create permission on a directory several levels above its configured working directory?!

Just silently move on if the directory structure already exists. Only if the service cannot create a file in its working directories is there actually a problem. A simple file-create test, like writing a new model file, would be all that's needed.
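The check described above could look like this (a sketch of the suggested behavior, not Ollama's actual code; `probe_models_dir` is a hypothetical name):

```shell
# Suggested behavior: if the models directory already exists, just test
# that a file can be created in it, instead of trying to mkdir every
# ancestor of the configured path.
probe_models_dir() {
  dir=$1
  probe=$dir/.ollama-write-test
  if [ -d "$dir" ] && touch "$probe" 2>/dev/null; then
    rm -f "$probe"
    return 0            # directory exists and is writable: nothing to do
  fi
  echo "cannot create files in $dir" >&2
  return 1
}
```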

Error:

Sep 04 10:06:29 justin-two-towers systemd[1]: Started Ollama Service.
Sep 04 10:06:29 justin-two-towers ollama[1025227]: 2024/09/04 10:06:29 routes.go:1125: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/media/justin/external/CodeReady/LLM-Models/ollama-models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR: OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
Sep 04 10:06:29 justin-two-towers ollama[1025227]: Error: mkdir /media/justin/external: permission denied
Sep 04 10:06:29 justin-two-towers systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
Sep 04 10:06:29 justin-two-towers systemd[1]: ollama.service: Failed with result 'exit-code'.

Permissions Changes

$ ls -l /media/justin/
total 10
drwxrwxrwx 15 justin ollama 4096 Aug 31 13:57 external

$ ls -l /media/justin/external/CodeReady/LLM-Models/ | grep ollama-models
drwxrwxrwx 4 ollama ollama        4096 Sep  4 09:01 ollama-models

$ ls -l .. | grep LLM
drwxrwxr-x   3 justin ollama  4096 Sep  4 09:44 LLM-Models

And my service config:

### Editing /etc/systemd/system/ollama.service.d/override.conf
### Anything between here and the comment below will become the new contents of the file

[Service]
Environment="OLLAMA_MODELS=/media/justin/external/CodeReady/LLM-Models/ollama-models"
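For reference, the usual way to apply such a drop-in override on a standard systemd install (shown here for context, not executed):

```shell
# Edit the drop-in, reload systemd's view of unit files, and restart.
sudo systemctl edit ollama.service      # opens override.conf in $EDITOR
sudo systemctl daemon-reload            # pick up the changed drop-in
sudo systemctl restart ollama.service
journalctl -u ollama.service -n 20      # check the startup log for errors
```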

OS

Linux

GPU

No response

CPU

No response

Ollama version

0.3.9

GiteaMirror added the bug label 2026-04-12 15:06:14 -05:00

@rick-github commented on GitHub (Sep 4, 2024):

What's the result of ls -ld /media/justin? What's the filesystem type on external?

$ ollama -v
ollama version is 0.3.9
$ sudo systemctl stop ollama.service 
$ sudo systemctl cat ollama.service | grep MODEL
Environment="OLLAMA_MODELS=/media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models"
$ sudo rm -rf /media/rick/EXTERNAL/CodeReady
$ mkdir -m 777 /media/rick/EXTERNAL/CodeReady
$ find /media/rick/ -printf "%-7u %m %p\n"
root    751 /media/rick/
rick    755 /media/rick/EXTERNAL
rick    777 /media/rick/EXTERNAL/CodeReady
$ sudo systemctl start ollama.service
$ find /media/rick/ -printf "%-7u %m %p\n"
root    751 /media/rick/
rick    755 /media/rick/EXTERNAL
rick    777 /media/rick/EXTERNAL/CodeReady
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/blobs
$ ollama list
NAME	ID	SIZE	MODIFIED 
$ ollama pull qwen2:0.5b
pulling manifest 
pulling 8de95da68dc4... 100% ▕███████████▏ 352 MB                         
pulling 62fbfd9ed093... 100% ▕███████████▏  182 B                         
pulling c156170b718e... 100% ▕███████████▏  11 KB                         
pulling f02dd72bb242... 100% ▕███████████▏   59 B                         
pulling 2184ab82477b... 100% ▕███████████▏  488 B                         
verifying sha256 digest 
writing manifest 
success 
$ ollama list
NAME      	ID          	SIZE  	MODIFIED       
qwen2:0.5b	6f48b936a09f	352 MB	13 seconds ago	
$ find /media/rick/ -printf "%-7u %m %p\n"
root    751 /media/rick/
rick    755 /media/rick/EXTERNAL
rick    777 /media/rick/EXTERNAL/CodeReady
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/manifests
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/manifests/registry.ollama.ai
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/manifests/registry.ollama.ai/library
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/manifests/registry.ollama.ai/library/qwen2
ollama  644 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/manifests/registry.ollama.ai/library/qwen2/0.5b
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/blobs
ollama  644 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/blobs/sha256-62fbfd9ed093d6e5ac83190c86eec5369317919f4b149598d2dbb38900e9faef
ollama  644 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/blobs/sha256-2184ab82477bc33a5e08fa209df88f0631a19e686320cce2cfe9e00695b2f0e6
ollama  644 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/blobs/sha256-8de95da68dc485c0889c205384c24642f83ca18d089559c977ffc6a3972a71a8
ollama  644 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/blobs/sha256-f02dd72bb2423204352eabc5637b44d79d17f109fdb510a7c51455892aa2d216
ollama  644 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/blobs/sha256-c156170b718ec29139d3653d40ed1986fd92fb7e0959b5c71f3c48f62e6636f4
# stop ollama, unplug and replug the device
$ ollama list
NAME      	ID          	SIZE  	MODIFIED      
qwen2:0.5b	6f48b936a09f	352 MB	4 minutes ago	
$ find /media/rick/ -maxdepth 5 -printf "%-7u %m %p\n"
root    751 /media/rick/
rick    755 /media/rick/EXTERNAL
rick    777 /media/rick/EXTERNAL/CodeReady
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/manifests
ollama  755 /media/rick/EXTERNAL/CodeReady/LLM-Models/ollama-models/blobs
$ ollama run qwen2:0.5b
>>> hello
Hello! How can I assist you today?
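One way to answer the filesystem-type question above (a sketch; `stat -f` is GNU coreutils). On vfat/exfat/ntfs mounts, `chown`/`chmod` are effectively no-ops, which would also explain permission changes having no effect:

```shell
# Report the filesystem type backing a mount point. stat -f prints
# filesystem (not file) metadata; %T is the human-readable fs type.
fs_type() { stat -f -c %T "$1"; }

mount_point=/media/justin/external    # path from the original report
if [ -e "$mount_point" ]; then fs_type "$mount_point"; fi
```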