[GH-ISSUE #680] Is there a way to change the download/run directory? #62344

Closed
opened 2026-05-03 08:18:44 -05:00 by GiteaMirror · 42 comments

Originally created by @improvethings on GitHub (Oct 2, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/680

On Linux, I want to download/run it from a directory with more space than /usr/share/

GiteaMirror added the feature request label 2026-05-03 08:18:45 -05:00

@tsugabloom commented on GitHub (Oct 2, 2023):

It's just using `$HOME`, as seen here:

https://github.com/search?q=repo%3Ajmorganca%2Follama+.ollama&type=code

The `/usr/share` is coming from `Environment="HOME=/usr/share/ollama"`:

https://github.com/search?q=repo%3Ajmorganca%2Follama%20usr%2Fshare&type=code


@BruceMacD commented on GitHub (Oct 27, 2023):

@improvethings I think #897, which will be in the next release, may resolve your issue. It allows setting the directory where the model files are stored.


@adbrei commented on GitHub (Oct 31, 2023):

Do you know approximately when the next release will be deployed?


@technovangelist commented on GitHub (Dec 4, 2023):

This is solved by using the `OLLAMA_MODELS` environment variable. Once you set that for the account that runs ollama, models will go wherever you want. I will go ahead and close this issue now. If you think there is anything we left out, reopen and we can address it. Thanks for being part of this great community.
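The fallback behavior described in this thread can be sketched in the shell as a quick sanity check (the default path is the one discussed above):

```shell
# OLLAMA_MODELS wins when set; otherwise the server falls back to
# $HOME/.ollama/models -- which is why the systemd unit's
# HOME=/usr/share/ollama puts models under /usr/share.
models_dir="${OLLAMA_MODELS:-$HOME/.ollama/models}"
echo "models dir: $models_dir"
```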


@mrektor commented on GitHub (Jan 18, 2024):

Could it be possible to set this download path from the ollama command line? Like `ollama set-model-path /my/path` or something like this...


@kac487 commented on GitHub (Feb 1, 2024):

I second this; I'm unable to get `OLLAMA_MODELS` to make any difference to where models are stored. Having the ability to directly set the model path would be great.


@amorphius commented on GitHub (Feb 1, 2024):

Same issue: the `OLLAMA_MODELS` variable does not work for me.


@BananaAcid commented on GitHub (Feb 1, 2024):

It works for me, both in the service, by adding `Environment="OLLAMA_MODELS=/srv/models"` (the models folder must be writable by/owned by the 'ollama' user), and in the shell (on one line): `OLLAMA_MODELS=/srv/models ollama run mixtral`


@amorphius commented on GitHub (Feb 2, 2024):

Sorry for the confusion, it works now. In my case I had to start `ollama serve` first and only then `ollama run ...`. Initially I passed `OLLAMA_MODELS` to the `ollama run` command, but it turned out I needed to pass it to `ollama serve`.
Now everything works fine for me.


@RiverHousePresents commented on GitHub (Feb 18, 2024):

For users of the newly released Windows version of Ollama: you need to add your new directory both to the "System variables" in the "Environment Variables" dialog and to the "Path" under "User variables for ***".


@macmus82 commented on GitHub (Feb 20, 2024):

> For users of the newly released Windows version of Ollama, you need to add your new directory to both the "System variables" in the "Environment Variables" and in the "Path" under the "User variables for ***"

How did you install the model in a directory other than C:? I don't have enough storage on drive C: in Windows and I'm not able to download it to another directory.


@ArbyC commented on GitHub (Mar 5, 2024):

Similar concern: how do I install or download models to a directory other than C:, which seems to be the default both for installing Ollama and for running models?


@bryanhughes commented on GitHub (Mar 11, 2024):

This does not work for me. I second the feature of a command-line option to formally set the path. I am on a Linux instance, and the installer creates `/usr/share/ollama` as the home directory, without a shell, containing `.bashrc` and `.profile`. I tried adding `OLLAMA_MODELS=...` and nothing is written to the new location. :(


@AnandBhandari1 commented on GitHub (Mar 14, 2024):

On Windows, it worked perfectly: `setx OLLAMA_MODELS "D:\ollama_model"`


@jay-singhvi commented on GitHub (Mar 22, 2024):

Can we have a way to store models at custom paths per model, like specifying the path when a model is downloaded for the first time?
If the model is not there already, download and run it; otherwise run it directly.
That way we could even maintain different versions of the same model in different directories.
I am really hoping this would work on Windows too :)


@ElderMedic commented on GitHub (Mar 23, 2024):

It's so counter-intuitive that `ollama pull` cannot set the location of the downloaded model through an optional parameter; actually, ollama commands have basically no flags at all.
I believe most Linux users do not use /usr/share to store data as large as an LLM.
Please consider adding something like a `--out` flag for `pull` and an `--in` flag for `run`; that would be perfect.


@chriskosinski commented on GitHub (Apr 3, 2024):

I also had problems; it's not very intuitive. I set the variable everywhere and made extra sure everything under the new /ollama/ folder is writable and owned by the ollama user. Steps:

  1. Follow https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server and set `Environment="OLLAMA_MODELS=/folder/ollama/models/"`
  2. `sudo chown -R ollama:ollama /folder/ollama/`
  3. `sudo chmod -R 775 /folder/ollama/`
  4. `OLLAMA_MODELS="/folder/ollama/models/" ollama serve`
  5. `OLLAMA_MODELS="/folder/ollama/models/" ollama run llama2`
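Before pointing `OLLAMA_MODELS` at a new location, it can save debugging time to confirm the directory exists and is writable; a minimal sketch (the path is an example):

```shell
# Create the target directory and confirm it is writable before the server
# tries to download into it. Example path; for the systemd service the
# directory must also be writable by the 'ollama' user.
dir=/tmp/ollama-models
mkdir -p "$dir"
chmod 775 "$dir"
if [ -w "$dir" ]; then
    echo "ok: $dir is writable"
fi
```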

@satangel2222 commented on GitHub (May 2, 2024):

How To Change Ollama Model Default Directory
https://www.youtube.com/watch?v=uj1VnDPR9xo

Sharing this; hope it helps you solve your problem.


@vector4wang commented on GitHub (May 14, 2024):

> On windows, it worked perfectly. setx OLLAMA_MODELS "D:\ollama_model"

It worked for me on Windows.


@igadmg commented on GitHub (Jun 23, 2024):

For everyone who says `OLLAMA_MODELS` does not work: kill the running ollama instances first, then start it again.


@yoyojacky commented on GitHub (Jul 12, 2024):

Just a new idea: I have attached an M.2 PCIe NVMe SSD to my Raspberry Pi 5 running Raspberry Pi OS 64-bit (bookworm). I found that the ollama user's home directory is `/usr/share/ollama`, so I tried this:

1. Stop the ollama service:

```bash
sudo systemctl stop ollama.service
```

2. Partition the SSD, format it, and mount it directly on the `/usr/share/ollama` folder (in `fdisk`: `d`, `p`, `n`, `p`, `1`, Enter, Enter, `w`):

```bash
sudo fdisk /dev/nvme0n1
sudo mkfs.ext4 /dev/nvme0n1p1
sudo mount -t ext4 /dev/nvme0n1p1 /usr/share/ollama -v
sudo chown -R ollama:ollama /usr/share/ollama
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
```

3. Pull models:

```bash
for model in llama3 llama2 mistral phi3 qwen:4b qwen:7b codegema
do
    ollama pull $model
done
```

And it works perfectly. Hope it can help you out.
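One caveat with the mount-over-home approach: the mount does not persist across reboots unless it is also recorded in /etc/fstab. A sketch of a matching entry (same device and mountpoint as in the steps above; using `UUID=` from `blkid` instead of the device name is more robust):

```
# /etc/fstab
/dev/nvme0n1p1  /usr/share/ollama  ext4  defaults  0  2
```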


@Eyalm321 commented on GitHub (Jul 30, 2024):

> It’s so counter-intuitive that ollama pull cannot set the location of the downloaded model through an optional parameters, actually all ollama commands basically have no flag. I believe most linux user does not use /usr/share to store data as large as LLM. Please consider something like adding a --out for pull and --in for run, it would be perfect.

This.


@yordis commented on GitHub (Jan 6, 2025):

If you are using the macOS app, make sure to close and reopen the app so it picks up the new environment value, or use any other method that lets the server see the new value.


@chriskosinski commented on GitHub (Jan 19, 2025):

To run as a service with another directory:

  1. `sudo vi /etc/systemd/system/ollama.service`
  2. Change
     `Environment="PATH=/home/user/anaconda3/condabin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/bin:/usr/bin"`
     to
     `Environment="PATH=/home/user/anaconda3/condabin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/bin:/usr/bin" "OLLAMA_MODELS=/mnt/mydrive/ollama/models"`
  3. Save, then `sudo systemctl daemon-reload`
  4. `sudo systemctl force-reload ollama.service`
  5. `sudo systemctl start ollama.service`

Now you have ollama running as a service with a chosen model directory.


@denohk commented on GitHub (Jan 30, 2025):

Win10:
Press the Windows key and search for "Edit the system environment variables" > Environment Variables > New.
Variable name: OLLAMA_MODELS
Variable value: the folder you would like to use.
Log out of Windows 10, then start the Ollama server again; you can now save your models into the folder you just set up. Done.


@Ramadanko commented on GitHub (Feb 2, 2025):

> OLLAMA_MODELS

After setting the env variable, quit the Ollama app and wait about 30 seconds, then reopen it, open the terminal, and pull whatever you want!


@ikskoder commented on GitHub (Feb 4, 2025):

Changing /etc/systemd/system/ollama.service on Ubuntu didn't work for me, but declaring the env variable just before running ollama does the trick:

```
export OLLAMA_MODELS=/path/you/want
ollama serve
```

@bkrajendra commented on GitHub (Feb 6, 2025):

The following worked for me on macOS: https://github.com/ollama/ollama/blob/main/docs%2Ffaq.md#setting-environment-variables-on-mac

For setting the host:

`launchctl setenv OLLAMA_HOST "0.0.0.0"`

For changing the model directory:

`launchctl setenv OLLAMA_MODELS "/Users/<user>/Documents/llm_models"`

Then restart the Ollama app.


@giantamoeba commented on GitHub (Feb 7, 2025):

The following worked on Linux for me:

`sudo systemctl edit ollama.service`

And then in the editor that opens, write:

```
[Service]
Environment="OLLAMA_MODELS=/mnt/mymodels_dir"
```

Save, then restart the service:

```
sudo systemctl stop ollama.service
sudo systemctl start ollama.service
```


@kabyanil commented on GitHub (Mar 11, 2025):

As of March 11, 2025: adding `export OLLAMA_MODELS="custom/path/to/ollama-models"` to .bashrc or .zshrc does not work. Setting ownership of the ollama-models folder to ollama:ollama doesn't work. Setting `Environment="OLLAMA_MODELS=custom/path/to/ollama-models"` in the ollama.service file doesn't work.

Downloading an LLM to another folder is a basic requirement. Considering ollama is so extensively used and maintained, this should just work.


@bkrajendra commented on GitHub (Mar 11, 2025):

> As of March 11 2025, adding export OLLAMA_MODELS="custom/path/to/ollama-models" to .bashrc or .zshrc does not work. Setting ownership to ollama:ollama to the folder ollama-models doesn't work. Setting Environment="OLLAMA_MODELS=custom/path/to/ollama-models" in ollama.service file doesn't work.
>
> Downloading an LLM to another folder is a basic requirement. Considering ollama is so extensively used and maintained, this should have easily worked.

Can you make sure you restarted the ollama service? Also check whether these environment variables are set and accessible.

I am using the latest version; for me it works on Windows, Mac, and Linux without any issue.


@kabyanil commented on GitHub (Mar 13, 2025):

> > As of March 11 2025, adding export OLLAMA_MODELS="custom/path/to/ollama-models" to .bashrc or .zshrc does not work. Setting ownership to ollama:ollama to the folder ollama-models doesn't work. Setting Environment="OLLAMA_MODELS=custom/path/to/ollama-models" in ollama.service file doesn't work.
> > Downloading an LLM to another folder is a basic requirement. Considering ollama is so extensively used and maintained, this should have easily worked.
>
> Can you make sure you restarted the ollama service. Also check if these environment variables are set and accessible.
>
> I am using the latest version, for me it works on windows, mac and Linux without any issue.

Yes, I restarted the ollama service. I also checked that the env variables were accessible by printing them in the terminal.

Can you share your configuration steps?


@erickmiller commented on GitHub (Mar 15, 2025):

I'm on Linux. I just ran this to install the latest version:

`curl -fsSL https://ollama.com/install.sh | sh`

And then just threw this in my `.bashrc` to keep it simple for testing in a shell:

```bash
export OLLAMA_MODELS=/path/to/my/ollama/models

if ! pgrep -f "ollama serve" > /dev/null; then
	nohup ollama serve > /dev/null 2>&1 & disown
fi
```

And everything seems to work fine. Ollama silently starts in my shell, and models download to the custom directory the first time `ollama run` is called; for example, `ollama run gemma3` downloads the default gemma3 model to the custom path set in `OLLAMA_MODELS`, under the folders ollama makes (the `blobs` and `manifests` sub-directories). I didn't have to `chown` to `ollama:ollama` and I didn't have to restart the service. If the service is already running you may need to restart it, but you might as well just put these lines in your `.bashrc` and curl down and run the installer script to get the latest version anyway. Good luck and happy local inferencing.


@anuprulez commented on GitHub (Mar 27, 2025):

For me, setting the OLLAMA_MODELS path did not work either. I followed the manual installation guide (https://github.com/ollama/ollama/blob/main/docs/linux.md#manual-install) and changed the `/usr` path mentioned in the command to my custom path while extracting the package. Then `ollama pull llama2` pulled Llama 2 to the path I provided. Hope it helps.


@Binjian commented on GitHub (Apr 14, 2025):

I'm on Linux, so I moved the "models" folder in "/usr/share/ollama/.ollama" to the desired partition and added a symlink in the original place pointing to the new folder. It seems to work. Would this be a problem in the future?


@StewartSethA commented on GitHub (Apr 19, 2025):

`ln -s /your/model/path ~/.ollama/`
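The move-then-symlink pattern several commenters describe can be wrapped up so the steps are explicit; a sketch (the function name and paths are made up for illustration):

```shell
# migrate_models SRC DST: move an existing models directory to DST and leave
# a symlink behind at SRC, so ollama keeps finding it at the old path.
migrate_models() {
    src="$1"
    dst="$2"
    mkdir -p "$(dirname "$dst")"   # ensure the destination's parent exists
    mv "$src" "$dst"               # move the models (nothing is deleted)
    ln -s "$dst" "$src"            # old path now points at the new location
}

# example (hypothetical paths):
# migrate_models "$HOME/.ollama/models" /mnt/bigdisk/ollama-models
```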


@jonas-eschle commented on GitHub (Apr 22, 2025):

@Binjian you can follow the instructions here to set the environment variables: https://github.com/ollama/ollama/blob/main/docs/faq.md#where-are-models-stored


@Gene1125 commented on GitHub (May 12, 2025):

I tried to set the environment variables by editing /etc/systemd/system/ollama.service, and it failed at `systemctl status ollama`.
Then I directly added `export OLLAMA_MODELS="custom/path/to/ollama-models"`; it still didn't work.
Finally I used a soft link, `ln -s /mnt/ollama/models /usr/share/ollama/.ollama/models`, and it works fine.
Hope that helps :D


@jclsn commented on GitHub (May 14, 2025):

@bryanhughes You are probably running ollama via systemd, so try doing

```bash
sudo systemctl edit ollama.service
```

and then paste this inside:

```
[Service]
Environment="OLLAMA_MODELS=/home/user/.ollama/models"
```

Replace `user` with your username! Then do

```
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

This will override the systemd service that automatically starts ollama.


@ahashem commented on GitHub (Aug 3, 2025):

> Following worked for me on macOS: https://github.com/ollama/ollama/blob/main/docs%2Ffaq.md#setting-environment-variables-on-mac
>
> Fore setting host
>
> `launchctl setenv OLLAMA_HOST "0.0.0.0"`
>
> For changing model directory
>
> `launchctl setenv OLLAMA_MODELS "/Users/<user>/Documents/llm_models"`
>
> Restart the ollama app.

I can confirm this works on macOS. Additionally, if Ollama was installed using Homebrew:

`launchctl setenv OLLAMA_MODELS "/path/to/external/volume/llm_models"`

then `brew services restart ollama` to restart the Ollama app.


@timebinding-sinus commented on GitHub (Sep 25, 2025):

Ollama's 'config' catalog is in `~/.ollama`.

```bash
# Go to the home directory
cd ~

# Make sure the directories where you plan to save new models exist.
# I use a mounted 500 GB hard disk to hold big models.
mkdir ~/Hard_500G
mount /dev/sda1 ~/Hard_500G

# Create the directories (you can change them)
mkdir -p ~/Hard_500G/AI/Ollama

# Move the saved models to the new place (nothing is deleted)
mv ~/.ollama/models ~/Hard_500G/AI/Ollama/models

# Create a symbolic link to the new place (replace USERNAME with your user)
ln -s /home/USERNAME/Hard_500G/AI/Ollama/models ~/.ollama/models

# Install (run, pull) a new model
ollama serve

# (open a new terminal)
ollama run gemma3
```


@Kraven1109 commented on GitHub (Feb 10, 2026):

```
@echo off
:: Set the custom path for your models (for those using Windows like me; models will be stored in e:\llm\blobs)
set OLLAMA_MODELS=e:\llm\

:: Navigate to the directory where ollama is installed (optional but recommended if you don't have ollama in PATH)
cd /d "d:\Apps\ollama"

:: Start the Ollama server
echo Starting Ollama with models at E:\llm...
ollama serve
pause
```

Reference: github-starred/ollama#62344