[GH-ISSUE #4454] OLLAMA_MODELS environment variable is not respected #2783

Closed
opened 2026-04-12 13:06:34 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @JLCarveth on GitHub (May 15, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4454

What is the issue?

I have followed the steps in the FAQ (https://github.com/ollama/ollama/blob/main/docs/faq.md#where-are-models-stored) to change where Ollama stores the downloaded models.

I make sure to run systemctl daemon-reload and to restart the ollama service, and yet it is still storing the model blobs in /usr/share/ollama/... instead of the location specified in OLLAMA_MODELS.

I expect Ollama to download the models to the specified location. I have insufficient space left on my root partition, hence why I am trying to download the models to my home directory instead.

sudo systemctl edit ollama.service:

### Anything between here and the comment below will become the contents of the drop-in file

Environment="OLLAMA_MODELS=/home/jlcarveth/.ollama/models"

OS

Linux

GPU

AMD

CPU

Intel

Ollama version

0.1.32

GiteaMirror added the bug label 2026-04-12 13:06:34 -05:00
Author
Owner

@mxyng commented on GitHub (May 15, 2024):

Your drop-in file is missing a [Service] header, so systemd doesn't know which section the configuration belongs in.

This should be the full contents of the configuration file:

[Service]
Environment="OLLAMA_MODELS=/home/jlcarveth/.ollama/models"

The FAQ example (https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux) uses OLLAMA_HOST, but the idea is the same.
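A quick way to catch the missing-header mistake before reloading systemd is to grep the drop-in for the section header. This is a sketch, using a temp file as a stand-in for the real drop-in path:

```shell
# Stand-in for the real drop-in at /etc/systemd/system/ollama.service.d/override.conf.
conf=$(mktemp)
printf '[Service]\nEnvironment="OLLAMA_MODELS=/home/jlcarveth/.ollama/models"\n' > "$conf"

# A drop-in whose settings belong to the service must start with [Service].
if grep -q '^\[Service\]' "$conf"; then
  echo "drop-in OK"
else
  echo "missing [Service] header" >&2
fi
rm -f "$conf"
```

After fixing the real file, `sudo systemctl daemon-reload && sudo systemctl restart ollama` applies it.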

Author
Owner

@icf20 commented on GitHub (Jun 6, 2024):

@mxyng setting the variable the way you posted it prevents ollama from starting:

cat /etc/systemd/system/ollama.service.d/override.conf                                                                                                                      
[Service]
Environment="OLLAMA_MODELS=/mnt/media/ollama/models"
systemctl status ollama
● ollama.service - Ollama Service
     Loaded: loaded (/usr/lib/systemd/system/ollama.service; disabled; preset: disabled)
    Drop-In: /etc/systemd/system/ollama.service.d
             └─override.conf
     Active: activating (auto-restart) (Result: exit-code) since Thu 2024-06-06 13:53:45 CEST; 2s ago
    Process: 23510 ExecStart=/usr/bin/ollama serve (code=exited, status=1/FAILURE)
   Main PID: 23510 (code=exited, status=1/FAILURE)
        CPU: 26ms
Author
Owner

@icf20 commented on GitHub (Jun 6, 2024):

For anyone reading this, or future me:

I had to copy /usr/lib/systemd/system/ollama.service and edit that copy:

### Anything between here and the comment below will become the contents of the drop-in file

[Unit]
Description=Ollama Service
Wants=network-online.target
After=network.target network-online.target

[Service]
#ExecStart=/usr/bin/ollama serve
#WorkingDirectory=/var/lib/ollama
Environment="GIN_MODE=release" "HOME=/mnt/media/ollama" "OLLAMA_MODELS=/mnt/media/ollama/models"
User=user
Group=user
Restart=on-failure
RestartSec=3
Type=simple
PrivateTmp=yes
ProtectSystem=full
ProtectHome=yes
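One thing worth flagging about the unit above (an editorial observation, not something resolved in this thread): it sets ProtectHome=yes, which makes /home appear empty to the service, so pointing OLLAMA_MODELS at a home directory, as in the original report, can fail even when the [Service] header is correct. A drop-in could relax that sandboxing instead of replacing the whole unit, roughly:

```ini
# Hypothetical drop-in at /etc/systemd/system/ollama.service.d/override.conf.
# ProtectHome=read-only would still block writes; "no" lifts the shield entirely.
[Service]
Environment="OLLAMA_MODELS=/home/jlcarveth/.ollama/models"
ProtectHome=no
```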
Author
Owner

@xnming commented on GitHub (Jan 26, 2025):

After researching online and consulting with ChatGPT, I was able to resolve the issue successfully. For anyone facing a similar problem on Linux, ensure that the ollama user has write permissions for the specified path. Below is a complete tutorial on setting up OLLAMA_MODELS:

1. Identify the User Running the Ollama Service

To ensure the correct permissions are applied, determine the username of the account running the Ollama service.

ps aux | grep ollama    # in most cases, the username would be 'ollama'

The username will appear in the first column of the output.

2. Create a New Folder for Models

Choose or create a directory for storing the models:

mkdir -p /path/to/ollama/models

Replace /path/to/ollama/models with your desired directory path.

3. Change Ownership and Permissions

Grant ownership and full write permissions of the directory to the user running the Ollama service.

sudo chown -R ollama_user /path/to/ollama/models
sudo chmod -R 755 /path/to/ollama/models

Replace ollama_user with the username obtained in Step 1.

4. Edit the Ollama Service Configuration

Update the Ollama service configuration to include the OLLAMA_MODELS environment variable.

  1. Open the service file:
sudo systemctl edit ollama.service
  2. Add the following lines under the [Service] section:
[Service]
Environment="OLLAMA_MODELS=/path/to/ollama/models"
  3. Save and exit the editor.

5. Reload and Restart the Ollama Service

Apply the changes by reloading the system configuration and restarting the service.

sudo systemctl daemon-reload
sudo systemctl restart ollama

6. Verify the Configuration

Ensure the service is running without errors:

sudo systemctl status ollama

Test the connectivity:

curl http://localhost:11434/api/version

If the service responds with the version details, the configuration is successful.

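Steps 2 and 3 of the tutorial above can be sketched end to end. Here a temp directory stands in for /path/to/ollama/models, and the chown step is left as a comment since it needs root and the real service user:

```shell
# Temp dir standing in for /path/to/ollama/models.
models="$(mktemp -d)/models"
mkdir -p "$models"

# In the real setup: sudo chown -R ollama "$models"
chmod -R 755 "$models"   # owner rwx, group/others r-x

# The service user must be able to write here, or model pulls will fail.
if [ -w "$models" ]; then
  echo "models dir writable"
fi
```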
Author
Owner

@donymorph commented on GitHub (Mar 18, 2025):

I don't know why, but Environment="OLLAMA_MODELS=/path/to/ollama/models" didn't work for me. Instead I used a symbolic link:
ln -s model/stored/path ollama/recognized/path
For example:
sudo ln -s /media/hdd/models /home/officepc/.ollama

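The symlink workaround can be sketched with temp directories standing in for /media/hdd/models and the .ollama directory; note the link target must still be readable and writable by whatever user runs the service:

```shell
# Temp dirs standing in for /media/hdd/models and /home/officepc/.ollama.
real_models=$(mktemp -d)
ollama_home=$(mktemp -d)

# Point the path Ollama looks at to where the models actually live.
ln -s "$real_models" "$ollama_home/models"

readlink "$ollama_home/models"   # prints the real storage path
```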
Author
Owner

@abubakr380 commented on GitHub (Aug 26, 2025):

My problem was with the permissions of the parent models folder. It was initially created by the ollama user, and I later changed the user in the systemd unit file, so the new user didn't have write permission (ollama runs mkdir to make sure the parent directory exists).
To check permissions: sudo ls -la /usr/share/ollama/.ollama

Solution: change the directory owner to the new user with sudo chown -R new_user /usr/share/ollama/ (replace new_user).

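The ownership check above can also be scripted with stat rather than eyeballing ls output. A sketch, with a temp directory in place of /usr/share/ollama/.ollama:

```shell
# Temp dir standing in for /usr/share/ollama/.ollama.
dir=$(mktemp -d)

# %U prints the owning user; compare it against the User= in the unit file.
owner=$(stat -c '%U' "$dir")
echo "owned by: $owner"
```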

Reference: github-starred/ollama#2783