[GH-ISSUE #3438] Bug in MODEL download directory and launching ollama service in Linux #2119

Open
opened 2026-04-12 12:21:19 -05:00 by GiteaMirror · 23 comments
Owner

Originally created by @ejgutierrez74 on GitHub (Apr 1, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3438

I'm writing this post to add more information:

1 - As you mentioned, I edited the service with `sudo systemctl edit ollama.service`:

![imagen](https://github.com/ollama/ollama/assets/11474846/d82ca623-5b89-4e8c-8b25-81a82de0b7b3)

And /media/Samsung/ollama_models is empty...

![imagen](https://github.com/ollama/ollama/assets/11474846/63001767-af41-4f47-823a-5c6506f3599d)

So there seems to be a bug here (as said before, the docs say you have to change the ollama.service file).

2 - ollama serve vs systemd

I ran `systemctl start ollama` (I booted my computer today), and it fails:

![imagen](https://github.com/ollama/ollama/assets/11474846/9449fd23-8a4f-4a06-abd1-f3339778ce91)

But if I run `ollama serve`, it seems to work (just to be sure, I started ollama again, checked the status, and then executed `ollama serve`):

![imagen](https://github.com/ollama/ollama/assets/11474846/a4c14ca7-4994-4497-a634-1ebad8cd1e77)

And in another tab ollama seems to work:

![imagen](https://github.com/ollama/ollama/assets/11474846/352524e4-ce54-4b9d-8ec1-e719f4a16b1d)

3 - Where are the models downloaded?

As posted before, /media/Samsung/ollama_models is empty, as you can see.
/home/ollama doesn't exist:

![imagen](https://github.com/ollama/ollama/assets/11474846/9dbb5c4e-27ce-4503-b756-eab30b9efd72)

and /usr/share/ollama:

![imagen](https://github.com/ollama/ollama/assets/11474846/6b2e23b5-f245-4393-8b34-0ffde5705197)

I'm going mad ;)

Thanks for your help.

Edit for update: I finally found the models at /home/eduardo/.ollama, but they shouldn't be there, since the default directory is /usr/share/ollama/.ollama and I set the environment variable OLLAMA_MODEL to point to /media/Samsung/ollama_models.

Originally posted by @ejgutierrez74 in https://github.com/ollama/ollama/issues/3045#issuecomment-1991349181

GiteaMirror added the bug, linux labels 2026-04-12 12:21:19 -05:00

@ejgutierrez74 commented on GitHub (Apr 17, 2024):

Why was this closed? The problem still happens...
As I also commented in another post, it would be nice to add an `ollama stop` feature for a clean stop and kill of the ollama server process.

ollama 0.1.32
Ubuntu 22.04, latest updates

![Captura desde 2024-04-17 11-56-39](https://github.com/ollama/ollama/assets/11474846/3172c817-8cdf-461b-914f-f261b67d3709)

![Captura desde 2024-04-17 11-55-03](https://github.com/ollama/ollama/assets/11474846/a3217f6c-3ad6-4a50-b278-7c14b64f75ea)


@sprklinginfo commented on GitHub (May 9, 2024):

@ejgutierrez74 and others, I wanted to use a different directory for models on my Ubuntu 22 machine since it has more space. I think you misunderstood how to update the ollama.service file. In the editor, you add what you want to change above the commented-out lines; those new lines override the default settings. Making changes to the commented lines won't work, since they are commented out :-). Anyway, here is my updated service file:

```shell
### Anything between here and the comment below will become the new contents of the file

[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_MODELS=/new/directory/for/my/local/models"

### Lines below this comment will be discarded

### /etc/systemd/system/ollama.service
# [Unit]
# Description=Ollama Service
# After=network-online.target
#
# [Service]
# ExecStart=/usr/local/bin/ollama serve
# User=ollama
# Group=ollama
# Restart=always
# RestartSec=3
# Environment="PATH=/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games"
# ...
```
Then, as the documentation says, save and exit, then reload systemd and restart Ollama:

```shell
systemctl daemon-reload
systemctl restart ollama
```

I had no problem restarting the ollama service; when I pulled llama3 again, it was stored in my new directory. Hope it helps.
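A quick way to confirm that systemd actually merged the drop-in is to ask it to print the effective unit and environment. This is a sketch that assumes `ollama.service` is installed on the machine; the `|| echo` fallback just keeps the snippet harmless elsewhere:

```shell
# Print the unit file together with every drop-in systemd merged into it:
systemctl cat ollama.service 2>/dev/null || echo "ollama.service not installed"

# Print only the effective Environment= settings; OLLAMA_MODELS should appear here:
systemctl show -p Environment ollama.service 2>/dev/null || echo "ollama.service not installed"
```

If `OLLAMA_MODELS` does not show up in the `Environment=` line, the drop-in was not picked up (for example, it was saved in the wrong place or `daemon-reload` was not run).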

@ejgutierrez74 commented on GitHub (May 16, 2024):

One thing: the service file, as you can see, is opened in nano, and nano shows a # symbol at the beginning of every line... if you open the same file in gedit, for example:

![imagen](https://github.com/ollama/ollama/assets/11474846/6930f789-75b9-4056-a458-78840106f405)

I got the problem:

![imagen](https://github.com/ollama/ollama/assets/11474846/d40b6021-1219-4f70-b542-899e9b136a29)

I can't start the service, but if I execute `ollama serve` in a terminal, it works (but with the default directory... now I'll try to add the OLLAMA_MODEL environment variable as you suggest).

Edit: ollama latest version, 0.1.37


@sprklinginfo commented on GitHub (May 16, 2024):

I don't think it's recommended to modify the default /etc/systemd/system/ollama.service file, as that may cause problems for future updates.

I followed instructions here: https://github.com/ollama/ollama/blob/main/docs/faq.md

```shell
$ sudo systemctl edit ollama.service
```

It will open an ollama.service file, but not that default file. What you enter here will be merged with the default version; lines here override the lines in the default version.

So I added those three lines, as shown in my previous message, then ran the following commands:

```shell
$ sudo systemctl daemon-reload
$ sudo systemctl restart ollama
```

@ejgutierrez74, I hope it helps. BTW, looking at your screenshot, I can see your model path starts with a double slash, "//media"; is that correct? For mine, I used another mounted disk, which I only need to specify with a single slash.


@ejgutierrez74 commented on GitHub (May 17, 2024):

No luck... can you check: when you execute ollama, does the new OLLAMA_MODELS environment variable appear at the beginning?

![imagen](https://github.com/ollama/ollama/assets/11474846/34c3aed5-9c0a-40a2-9456-0f6910e3ff10)

I corrected what you said about //media and changed the location of OLLAMA_MODELS:

![imagen](https://github.com/ollama/ollama/assets/11474846/bf861ecd-adb7-4848-b1b9-f02800e22826)

Again, systemd didn't work:

```
eduardo@MiPcLinux:~$ sudo systemctl daemon-reload
eduardo@MiPcLinux:~$ sudo systemctl restart ollama
eduardo@MiPcLinux:~$ sudo systemctl status ollama
● ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: enabled)
    Drop-In: /etc/systemd/system/ollama.service.d
             └─override.conf
     Active: activating (auto-restart) (Result: exit-code) since Fri 2024-05-17 11:19:23 CEST; 6>
    Process: 115059 ExecStart=/usr/local/bin/ollama serve (code=exited, status=1/FAILURE)
   Main PID: 115059 (code=exited, status=1/FAILURE)
        CPU: 26ms
```

Again, as you can see in the first picture, `ollama serve` works...

After running daemon-reload, I looked again with `sudo gedit /etc/systemd/system/ollama.service`, and nothing changed; OLLAMA_MODELS doesn't appear there. I don't know whether that is OK or wrong...

Also, I didn't change or set OLLAMA_HOST, as the default localhost is fine for me.

Hope you can help me.


@sprklinginfo commented on GitHub (May 17, 2024):

Again: when you run `sudo systemctl edit ollama.service`, it doesn't edit the default `/etc/systemd/system/ollama.service` file. You can see from the first line that the file opened for editing is `/etc/systemd/system/ollama.service.d/override.conf`.

After running `sudo systemctl start ollama`, if it shows an error again, you can always run `journalctl -u ollama.service` to view the log and check whether there is any additional information about the error.


@ejgutierrez74 commented on GitHub (May 18, 2024):

Thanks for your help:

1 - I think the config is OK now, no? This time I did it in the right place:

![imagen](https://github.com/ollama/ollama/assets/11474846/eb778649-b40c-4d5f-9413-cf2f80404f93)

2 - This operation (sudo systemctl edit ollama.service and editing the file): does it have to be done every time you boot the computer, or only once?

3 - This is what my journalctl -u ollama.service looks like:

![imagen](https://github.com/ollama/ollama/assets/11474846/5912ecb5-7d6a-462c-a282-776cf764e71d)

I still think this is a bug in ollama; it should be easier to change the default directory...

Thanks again


@sprklinginfo commented on GitHub (May 18, 2024):

The log clearly shows there were some permission issues. You need to fix them according to these lines:

![Snipaste_2024-05-18_11-14-55](https://github.com/ollama/ollama/assets/3521826/ca911097-bc5e-472e-9974-6f2f376f2b24)


@ejgutierrez74 commented on GitHub (May 19, 2024):

I don't see any permissions problem; my username is eduardo, which is in the root and ollama groups.

![imagen](https://github.com/ollama/ollama/assets/11474846/6e854c21-1c38-4b7a-8eeb-92a59c18b02a)

By the way, do I have to edit ollama.service every time I boot the computer?

Thanks


@JT-McLeod commented on GitHub (Jun 15, 2024):

I recently needed to move my models directory, and now I face these same problems.
I ran `sudo systemctl edit ollama.service` and made the suggested changes, but that doesn't work.

However, I have found that I can reach the models directory on a different disk if I start the server with:

```shell
OLLAMA_MODELS=<path/to/models> ollama serve
```

But this is not how I want to run ollama; I would rather run it via systemctl. Doing that generates:

```
Jun 15 14:54:19 Fiddler ollama[32949]: Error: mkdir /media/jim/a10TB: permission denied
Jun 15 14:54:19 Fiddler systemd[1]: ollama.service: Main process exited, code=exited, stat>
Jun 15 14:54:19 Fiddler systemd[1]: ollama.service: Failed with result 'exit-code'.
```

The permissions are:

```
drwxr-xr-x 4 ollama ollama 4096 Nov 17 2023 models/
```

Thanks
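A `permission denied` from `mkdir` under systemd usually means the `ollama` user is missing execute (search) permission on one of the *parent* directories, even when the final `models/` directory itself is owned by ollama. This hypothetical helper (the `check_path` name is mine, not an ollama tool) prints the mode and owner of every directory on the way to the models path; the demo runs on a throwaway directory instead of a real mount:

```shell
#!/bin/sh
# Walk from the target directory up to /, printing the mode and owner of each
# component, so you can spot where the service user loses the execute (x) bit.
check_path() {
  p=$1
  while [ -n "$p" ] && [ "$p" != "/" ]; do
    stat -c '%A %U:%G %n' "$p" 2>/dev/null || echo "missing: $p"
    p=$(dirname "$p")
  done
}

# Demo on a temporary directory standing in for something like /media/jim/a10TB:
demo=$(mktemp -d)/models
mkdir -p "$demo"
check_path "$demo"
```

On the real system you would run `check_path /media/jim/a10TB/models`; every listed component needs at least `x` for the ollama user (for example `sudo chmod o+x /media/jim /media/jim/a10TB`, an assumption about this particular layout, not a universal fix).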


@ejgutierrez74 commented on GitHub (Jun 15, 2024):

@jmorganca can this be fixed, please? It's an important bug, because many people need to change the models directory because of permissions, free space, business policy, etc...

Thanks


@JT-McLeod commented on GitHub (Jun 16, 2024):

> I recently needed to move my models directory, and now I face these same problems. I ran `sudo systemctl edit ollama.service` and made the suggested changes, but that doesn't work.
>
> However, I have found that I can reach the models directory on a different disk if I start the server with `OLLAMA_MODELS=<path/to/models> ollama serve`. But this is not how I want to run ollama; I would rather run it via systemctl. Doing that generates:
>
> ```
> Jun 15 14:54:19 Fiddler ollama[32949]: Error: mkdir /media/jim/a10TB: permission denied
> Jun 15 14:54:19 Fiddler systemd[1]: ollama.service: Main process exited, code=exited, stat>
> Jun 15 14:54:19 Fiddler systemd[1]: ollama.service: Failed with result 'exit-code'.
> ```
>
> The permissions are: `drwxr-xr-x 4 ollama ollama 4096 Nov 17 2023 models/`

I have found that a symlink doesn't work, but `mount --bind` does.
Forget trying to use the environment variable.


@SnailWhb commented on GitHub (Jul 17, 2024):

@ejgutierrez74 @JT-McLeod I'm facing these same problems. Have you solved them?


@JT-McLeod commented on GitHub (Jul 18, 2024):

I have a solution that works for me. As mentioned above, it is similar to a symlink but uses `mount --bind`.
On my Ubuntu machine I have a second drive called a10TB, and I want the models placed in its /models folder.

```shell
sudo mount --bind /media/jim/a10TB/models /usr/share/ollama/.ollama/models
```

Then start the server without the OLLAMA_MODELS assignment.

To have it happen at every restart, in /etc/fstab, after mounting a10TB, I have:

```
/media/jim/a10TB/models /usr/share/ollama/.ollama/models none bind,rw 0 0
```
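For anyone who prefers keeping the bind mount under systemd instead of /etc/fstab, the same bind can be expressed as a mount unit. This is a sketch assuming the same paths as above; the unit file name must be the systemd-escaped mount point, which `systemd-escape -p --suffix=mount /usr/share/ollama/.ollama/models` will print for you:

```
[Unit]
Description=Bind mount for Ollama models

[Mount]
What=/media/jim/a10TB/models
Where=/usr/share/ollama/.ollama/models
Type=none
Options=bind

[Install]
WantedBy=multi-user.target
```

Save it under /etc/systemd/system/ with the escaped name, then enable it with `sudo systemctl enable --now <escaped-name>.mount`.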


@ejgutierrez74 commented on GitHub (Jul 18, 2024):

I think it's a bug that has to be solved, and it seems not too difficult. @jmorganca


@openelearning commented on GitHub (Feb 20, 2025):

Hello, same needs and problems with Debian 12.
The Linux install documentation (https://github.com/ollama/ollama/blob/main/docs/linux.md) says "Create a service file in /etc/systemd/system/ollama.service:", but changing OLLAMA_MODELS there has no effect after `sudo systemctl daemon-reload` and `sudo systemctl enable ollama`.
https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server says "Edit the systemd service by calling systemctl edit ollama.service".
So, reading this issue, it seems the FAQ is the right way and the install doc is the wrong way?

Edit: this worked for me:

```shell
sudo systemctl edit ollama.service
```

Adding:

```
### Anything between here and the comment below will become the new contents of the file
[Service]
Environment="OLLAMA_MODELS=/the/path/you/want"
### Lines below this comment will be discarded
```

Note: it's important to add [Service] or it will not work.

```shell
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

Now ollama uses the new path.


@Chiramisudo commented on GitHub (Sep 6, 2025):

I have spent several hours trying to get this to work on version 0.11.8, tweaking the following and more:

- a symlink (i.e. `# ln -s ...`)
- `# mount --bind ...`
- `# chown -R ollama:ollama ...` and `chmod -R 755 ...`
- with and without the above, paired with and without the following overrides via `# systemctl edit ollama`:

```sh
[Service]
Environment="OLLAMA_MODELS=/home/user/.ollama/models"
ProtectHome=no  # 🚨 I see no conditionals in the code based on this variable at all.
```

The issue, as far as I can tell from the code (I'm not a Go programmer), is not in the configuration: the system journal makes it clear that the overridden model path is being read correctly. Rather, from my brief review, the issue appears to be in the following places, where 🚨 [`os.MkdirAll`](https://pkg.go.dev/os#MkdirAll) is used. That function creates all parent paths, which will most often not be necessary, and when the path is overridden by the user, creating it should be left to the user (adequate documentation should cover the steps to do this):

- [`server/manifest_test.go:createManifest(...)`](https://github.com/ollama/ollama/blob/4378ae4ffaaf71b649efcf87a6a3f77cb923f822/server/manifest_test.go#L17)
- [`server/routes_create_test.go:TestDetectModelTypeFromFiles(...)`](https://github.com/ollama/ollama/blob/4378ae4ffaaf71b649efcf87a6a3f77cb923f822/server/routes_create_test.go#L757)
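For context, `os.MkdirAll` has the same semantics as `mkdir -p`: it creates every missing parent directory, so the service fails at the first parent the `ollama` user cannot create or traverse, long before the models directory itself. A minimal illustration on a throwaway path:

```shell
# mkdir -p, like os.MkdirAll, creates the entire chain of parents at once:
base=$(mktemp -d)
mkdir -p "$base/media/disk/ollama_models/blobs"
ls -d "$base/media/disk/ollama_models/blobs"
```

Under systemd the chain starts at a path like /media/jim/a10TB, which the unprivileged service user typically cannot create, hence the `Error: mkdir /media/jim/a10TB: permission denied` seen earlier in this thread.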


@ejgutierrez74 commented on GitHub (Sep 24, 2025):

Why is this not fixed?? It's an old bug that affects a widely used, almost standard distro like Ubuntu...


@alwayssummer commented on GitHub (Oct 19, 2025):

After setting the environment variable failed, I had success with a bind mount. If you're going to bind, let the environment variable revert to the default.

```shell
sudo mount --bind /media/username/Data/Ollama/models /usr/share/ollama/.ollama/models
```


@mainguyenanhvu commented on GitHub (Mar 17, 2026):

@ejgutierrez74 I faced the same issue when editing the ollama.service file.
I ran `sudo systemctl edit ollama.service` (or `sudo systemctl edit ollama`), then added:

```
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

then saved and closed. However, it threw this message:

```
Editing "/etc/systemd/system/ollama.service.d/override.conf" canceled: temporary file is empty.
```

How do you overcome it?
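For what it's worth, `systemctl edit` reports "temporary file is empty" when none of the text you typed survives above the `### Lines below this comment will be discarded` marker in the editor buffer, so the two `[Service]`/`Environment=` lines must go above that marker. As an alternative, the drop-in can be created by hand: save the following as /etc/systemd/system/ollama.service.d/override.conf (creating the directory first with `sudo mkdir -p /etc/systemd/system/ollama.service.d`):

```
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

Then run `sudo systemctl daemon-reload && sudo systemctl restart ollama`. This is a sketch of the workaround discussed in this thread, not an official recipe.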


@darouwan commented on GitHub (Mar 23, 2026):

@mainguyenanhvu The same... the error still happens on Ubuntu. I have tried vim.


@darouwan commented on GitHub (Mar 23, 2026):

@mainguyenanhvu
My current solution is to edit /lib/systemd/system/ollama.service with vim and add:

```
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
```

@KnowledgeSuppository commented on GitHub (Mar 25, 2026):

Perhaps the following, which helped me, will work for you too.

**Important note:** if you've already pulled models to the default location, you might want to move them to the new location to avoid re-downloading:

```shell
# Stop Ollama first
sudo systemctl stop ollama.service

# Move existing models to the new location (if they exist)
sudo mv /usr/share/ollama/.ollama/models/* /media/Samsung/ollama_models/ 2>/dev/null || true

# Or, if the models are in the user's home directory
sudo mv ~/.ollama/models/* /media/Samsung/ollama_models/ 2>/dev/null || true

# Restart Ollama
sudo systemctl start ollama.service
```

**Step 1:** Stop Ollama first

```shell
sudo systemctl stop ollama.service
```

**Step 2:** Create the override directory and file manually

```shell
# Create the override directory
sudo mkdir -p /etc/systemd/system/ollama.service.d

# Create and edit the override file
sudo nano /etc/systemd/system/ollama.service.d/override.conf
```

**Step 3:** Add the correct configuration. In the editor, paste this:

```
[Service]
Environment="OLLAMA_MODELS=/media/Samsung/ollama_models"
```

**Step 4:** Verify the file was created and view its contents

```shell
sudo cat /etc/systemd/system/ollama.service.d/override.conf
```

**Step 5:** Reload and restart

```shell
sudo systemctl daemon-reload
sudo systemctl start ollama.service
```

**Step 6:** Verify the environment variable is set

```shell
# Check if the environment variable is being picked up
sudo systemctl show ollama.service --property=Environment

# Check the actual process
ps aux | grep ollama
```

Best of luck!


Reference: github-starred/ollama#2119