[GH-ISSUE #10019] mkdir /mnt/LLM/ollama: permission denied, when trying to change the model location #68627

Closed
opened 2026-05-04 14:38:58 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @fanlessfan on GitHub (Mar 27, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10019

What is the issue?

mkdir /mnt/LLM/ollama: permission denied. I want to move the models from the local drive to an NFS share. On the local drive I can create a symbolic link, but once I move the models to the NFS share the symbolic link stops working: it always tries to mkdir and fails, even though it has permission. I also tried setting OLLAMA_MODELS and got the same error. But as you can see, the directory has 777 permissions, is owned by ollama, and I can touch a file in it. The ollama.0 directory is the old ollama folder.

mkdir /mnt/LLM/ollama: permission denied

journalctl -r -u ollama

Mar 27 15:49:20 x11dph systemd[1]: ollama.service: Failed with result 'exit-code'.
Mar 27 15:49:20 x11dph systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
Mar 27 15:49:20 x11dph ollama[18437]: Error: mkdir /mnt/LLM/ollama: permission denied
Mar 27 15:49:20 x11dph ollama[18437]: 2025/03/27 15:49:20 routes.go:1230: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_V>
Mar 27 15:49:20 x11dph systemd[1]: Started ollama.service - Ollama Service.

uxadm@x11dph:/mnt/LLM$ touch a
uxadm@x11dph:/mnt/LLM$ ls -al
total 4
drwxrwxrwx 1 ollama ollama 30 Mar 27 15:46 .
drwxr-xr-x 5 root root 4096 Mar 27 13:14 ..
-rwxrwxrwx 1 uxadm uxadm 0 Mar 27 15:46 a
drwxr-x--- 1 ollama ollama 68 Mar 24 11:20 ollama.0

cat /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"
Environment="OLLAMA_KEEP_ALIVE=-1"
Environment="OLLAMA_MODELS=/mnt/LLM/ollama/.ollama/models"

[Install]
WantedBy=default.target
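As an aside, settings like these are often kept in a systemd drop-in rather than in the installed unit file itself, so they survive reinstalls. A minimal sketch of such an override, using the same variables and model path as the unit above:

```
# /etc/systemd/system/ollama.service.d/override.conf
# (created with: sudo systemctl edit ollama)
[Service]
Environment="OLLAMA_KEEP_ALIVE=-1"
Environment="OLLAMA_MODELS=/mnt/LLM/ollama/.ollama/models"
```

After saving, apply it with sudo systemctl daemon-reload && sudo systemctl restart ollama.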

Relevant log output

journalctl -r -u ollama

Mar 27 15:49:20 x11dph systemd[1]: ollama.service: Failed with result 'exit-code'.
Mar 27 15:49:20 x11dph systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
Mar 27 15:49:20 x11dph ollama[18437]: Error: mkdir /mnt/LLM/ollama: permission denied
Mar 27 15:49:20 x11dph ollama[18437]: 2025/03/27 15:49:20 routes.go:1230: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_V>
Mar 27 15:49:20 x11dph systemd[1]: Started ollama.service - Ollama Service.

OS

Linux

GPU

No response

CPU

Intel

Ollama version

0.6.2

GiteaMirror added the bug label 2026-05-04 14:38:58 -05:00
Author
Owner

@rick-github commented on GitHub (Mar 27, 2025):

Please use --no-pager when you run journalctl, it preserves the right hand side of the log.

What's the output of:

mount | grep LLM
systemctl cat ollama --no-pager
p="/mnt/LLM/ollama/.ollama/models"; ls -ld / $(while [ "$p" != "/" ]; do echo "$p"; p=$(dirname "$p"); done | tac)

What parameters is the NFS file system exported with?
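The third command lists the permissions on each component of the path from / down. A self-contained variant of the same dirname/tac walk, on a scratch path instead of the model path:

```shell
# Print each component of a path from the root downward,
# using the same dirname + tac walk as the one-liner above.
p=/tmp/scratch/a/b            # any path works; dirname is pure string handling
while [ "$p" != "/" ]; do
  echo "$p"
  p=$(dirname "$p")
done | tac
# Output:
#   /tmp
#   /tmp/scratch
#   /tmp/scratch/a
#   /tmp/scratch/a/b
```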

Author
Owner

@fanlessfan commented on GitHub (Mar 27, 2025):

Hi @rick-github,

Thank you for the quick response.

Here is the output. Thanks.

showmount -e NAS3
Export list for NAS3:
/volume5/LLM *
/volume5/DATA3 *
/volume5/SCAN *
/volume4/Movies *

NAS3:/volume5/LLM on /mnt/LLM type nfs (rw,relatime,vers=3,rsize=131072,wsize=131072,namlen=255,hard,proto=tcp,timeo=600,retrans=2,sec=sys,mountaddr=192.168.20.13,mountvers=3,mountport=892,mountproto=udp,local_lock=none,addr=192.168.20.13)

/etc/systemd/system/ollama.service

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"
Environment="OLLAMA_KEEP_ALIVE=-1"
Environment="OLLAMA_MODELS=/mnt/LLM/ollama/.ollama/models"

[Install]
WantedBy=default.target
drwxr-xr-x 24 root root 4096 Mar 26 11:11 /
drwxr-xr-x 5 root root 4096 Mar 27 13:14 /mnt
drwxrwxrwx 1 ollama ollama 24 Mar 27 16:20 /mnt/LLM
drwxrwxrwx 1 ollama ollama 68 Mar 24 11:20 /mnt/LLM/ollama
drwxrwxrwx 1 ollama ollama 60 Mar 24 11:20 /mnt/LLM/ollama/.ollama
drwxrwxrwx 1 ollama ollama 28 Mar 24 11:22 /mnt/LLM/ollama/.ollama/models

Author
Owner

@rick-github commented on GitHub (Mar 27, 2025):

The full path is there; was it created manually, or by a previous ollama run that succeeded?

What's the result of

sudo -u ollama touch /mnt/LLM/touch.test ; ls -al /mnt/LLM
Author
Owner

@fanlessfan commented on GitHub (Mar 27, 2025):

sudo -u ollama touch /mnt/LLM/touch.test ; ls -al /mnt/LLM
total 4
drwxrwxrwx 1 ollama ollama 44 Mar 27 17:41 .
drwxr-xr-x 5 root root 4096 Mar 27 13:14 ..
drwxrwxrwx 1 ollama ollama 68 Mar 24 11:20 ollama
-rwxrwxrwx 1 1024 users 0 Mar 27 17:41 touch.test

I think it's an NFS share permission problem. I have now mapped all users to admin on the NFS share and the problem is resolved. Thank you for the help.
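For anyone landing here with the same symptoms: with vers=3 and sec=sys, the client sends numeric uid/gid values and the NFS server applies its own mapping (squash) rules before checking permissions. That is why a 777 directory owned by ollama on the client can still refuse mkdir, and why touch.test above shows up as uid 1024/users rather than as ollama. In classic Linux /etc/exports terms (illustrative only; this NAS configures squashing through its own UI), the relevant options are:

```
# /etc/exports (illustrative; option names from exports(5))
# root_squash: remap only root to the anonymous user; other uids pass through
/volume5/LLM *(rw,sync,root_squash)
# all_squash: remap every client uid/gid to anonuid/anongid
/volume5/LLM *(rw,sync,all_squash,anonuid=1024,anongid=100)
```

"Map all users to admin" on the NAS behaves like all_squash with the anonymous account set to one that has write access to the export.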

Reference: github-starred/ollama#68627