[GH-ISSUE #10246] Linux tips- sharing models directory with Msty #6722

Closed
opened 2026-04-12 18:28:24 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @Dunravin on GitHub (Apr 12, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10246

I had a lot of head scratching getting Msty to share the Ollama models directory and retain normal Ollama functions.

Posting my process here in the hopes of helping others get the job done.

First, create a folder for the models under your system's /home directory, but not inside your user's home folder: go up a level.

sudo mkdir /home/.ollama_models

Give the directory to the ollama user and group:

sudo chown ollama:ollama /home/.ollama_models

and modify its permissions so it is readable by all users:

sudo chmod -R a+r /home/.ollama_models
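One thing to watch: reading a directory listing and entering a directory are separate permissions, and the execute (search) bit is what allows traversal. chmod's capital X grants execute on directories only, not on regular files. A quick local sketch with a throwaway directory (no sudo needed; the blob filename is made up for illustration):

```shell
# Throwaway demo: 'a+rX' adds read everywhere and execute on
# directories only, so other users can both list and enter the tree.
tmp=$(mktemp -d)
mkdir -p "$tmp/models/blobs"
touch "$tmp/models/blobs/sha256-example"   # hypothetical blob name
chmod -R a+rX "$tmp/models"
stat -c '%A' "$tmp/models"                 # directory mode, e.g. drwxr-xr-x
rm -rf "$tmp"
```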

Next, modify the ollama.service environment variables with:

sudo systemctl edit ollama.service

This will open a temporary override file for ollama.service. Read what it says and enter the following in the top section:

[Service]
Environment="OLLAMA_MODELS=/home/.ollama_models"

Write out, confirm, and quit the editor, then run:

sudo systemctl daemon-reload
sudo systemctl restart ollama
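If you prefer a non-interactive setup (in a provisioning script, say), the same drop-in that systemctl edit creates can be written directly. This is a sketch assuming the standard drop-in path that systemctl edit uses for this unit:

```shell
# Create the drop-in directly instead of via the interactive editor.
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf >/dev/null <<'EOF'
[Service]
Environment="OLLAMA_MODELS=/home/.ollama_models"
EOF
sudo systemctl daemon-reload
sudo systemctl restart ollama
```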

Confirm your override is being applied and the service is running with:

sudo systemctl status ollama.service
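Beyond the status output, the environment the unit will actually run with can be checked directly (these are standard systemctl subcommands; output will vary by system):

```shell
# Show only the Environment= settings applied to the unit:
systemctl show ollama -p Environment
# Expect OLLAMA_MODELS=/home/.ollama_models in the output.
systemctl is-active ollama   # prints 'active' when the service is up
```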

Now you can point Msty to the models folder and it should restart the Local AI service.

At this point I had to close Msty and reopen it; it was then ready to go and detected the model I had pulled with Ollama.

GiteaMirror added the documentation label 2026-04-12 18:28:24 -05:00

@tdorzhi commented on GitHub (May 9, 2025):

thanks for this, I struggled to understand how to customize the directory for storing models and this helped me

simply specifying OLLAMA_MODELS didn't help me, but your guide explained what else I was missing


@Dunravin commented on GitHub (May 11, 2025):

> thanks for this, I struggled to understand how to customize the directory for storing models and this helped me
>
> simply specifying OLLAMA_MODELS didn't help me, but your guide explained what else I was missing

You're welcome. I just revisited it after a fresh system install, and the steps weren't working for me this time around. I got in a mess with user and group permissions and the service would not start, so I simplified the setup to a single directory and chowned it to ollama:ollama.

If you need to revert to the default, remember to delete the drop-in override.conf:

sudo rm /etc/systemd/system/ollama.service.d/override.conf

then reload the daemon and restart/start the ollama service.
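Put together, the revert looks like this (assuming the override was the only drop-in for the unit):

```shell
# Remove the drop-in and return ollama to its default models path.
sudo rm /etc/systemd/system/ollama.service.d/override.conf
sudo systemctl daemon-reload
sudo systemctl restart ollama
```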

I made the edits to my OP.
