[GH-ISSUE #3719] How do I download an AI model to external storage and run it? #2287

Closed
opened 2026-04-12 12:33:24 -05:00 by GiteaMirror · 7 comments

Originally created by @manfar on GitHub (Apr 18, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3719

Originally assigned to: @dhiltgen on GitHub.

For Macs with insufficient internal disk space, how can I download models to an external SSD and run them from there instead of storing them on the computer itself? That way I could install more models and still run them quickly. It would also help to be able to search the storage path and view the download location in Finder.

Also, how do I find the path to the models I have already downloaded on my Mac? I can't find where they are stored.
GiteaMirror added the feature request label 2026-04-12 12:33:24 -05:00

@thinkverse commented on GitHub (Apr 18, 2024):

Information about default model directories and how to change them is available in the FAQ[^1].

https://github.com/ollama/ollama/blob/8645076a71941d78a996e52cff65c794df6cdbcb/docs/faq.md?plain=1#L139-L149

[^1]: https://github.com/ollama/ollama/blob/8645076a71941d78a996e52cff65c794df6cdbcb/docs/faq.md#where-are-models-stored
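For a quick test before committing to any launchd changes, the variable the FAQ describes can be exported in a single shell session and the server started by hand. A minimal sketch; the external volume path below is a placeholder, not a real mount point:

```shell
# Placeholder path: substitute your own external volume and folder.
export OLLAMA_MODELS="/Volumes/ExternalSSD/ollama/models"

# The directory should exist before the server starts.
mkdir -p "$OLLAMA_MODELS" 2>/dev/null || true
echo "Models will be stored in: $OLLAMA_MODELS"

# Then, in the same shell session, start the server: ollama serve
```

This only affects the one session; making the setting stick across logins is what the rest of the thread is about.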

@igorschlum commented on GitHub (Apr 18, 2024):

Hi @manfar, if the answer solves the issue, could you close this ticket?

@dhiltgen commented on GitHub (May 5, 2024):

If you're still having trouble, let us know.


@sonnyjlewis commented on GitHub (Jun 13, 2024):

Here is a solution that should work for people on the Mac platform.

### Stop the Ollama process:

`sudo killall ollama`

### Set the variable (this probably won't work, but it sent me down a rabbit hole that led to the right solution):

`launchctl setenv OLLAMA_MODELS "/Volumes/YourExternalVolume/location/path/"`

_You'll need to make sure you provide the correct external volume name and path. This approach most likely won't work, so if it's a no-go for you, continue reading._

### What did work:

`sudo nano /etc/launchd.conf`

Use nano, vi, or whatever editor you prefer. The file might not exist yet; if it's empty, that's OK too.

Add the following:

```
# Set environment variables here so they are available globally to all apps
# (and Terminal), including those launched via Spotlight.
#
# After editing this file, run the following command from the terminal to update
# environment variables globally without needing to reboot.
# NOTE: You will still need to restart the relevant application (including
# Terminal) to pick up the changes!
# grep -E "^setenv" /etc/launchd.conf | xargs -t -L 1 launchctl
#
# See http://www.digitaledgesw.com/node/31
# and http://stackoverflow.com/questions/135688/setting-environment-variables-in-os-x/
#
# Note that you must hardcode the paths below; don't use environment variables.
# You also need to surround multiple values in quotes.
#
setenv OLLAMA_HOME /Applications/Ollama.app
setenv OLLAMA_MODELS /Users/yourusername/.ollama /Volumes/externalmtpoint/morefoldersifdesired/.ollama
```

Change `yourusername` to your local user's username if you installed via the GUI installer. Change `externalmtpoint/morefoldersifdesired` to whatever directories on your external mount point you desire.

Reboot and enjoy.
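If models were already downloaded to the default location before switching, they can be copied to the new directory so they don't have to be pulled again. A sketch with placeholder paths, assuming the default macOS location of `~/.ollama/models`; delete the originals only after confirming the models still run from the new location:

```shell
# Placeholder destination: adjust to your external volume.
SRC="$HOME/.ollama/models"
DST="/Volumes/ExternalSSD/ollama/models"

mkdir -p "$DST" 2>/dev/null || true
# Copy everything under the models directory (blobs and manifests).
if [ -d "$SRC" ]; then
  cp -R "$SRC/." "$DST/"
fi
```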

@igorschlum commented on GitHub (Jun 15, 2024):

Hi @thinkverse, @sonnyjlewis,

I think loading speed matters for LLMs (Large Language Models) since the files are very large. When you do this, do you see a difference when loading a model into memory from an external drive compared to the internal one?

It could also be interesting for a school or a company to put models in a read-only directory, so each student or employee can only use the models the organization provides.
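The read-only idea can be approximated with ordinary file permissions once the models live in a shared directory, assuming an admin account owns the tree and pulls the models in advance. A sketch with a hypothetical path:

```shell
# Hypothetical shared location; adjust to your setup.
SHARED="/Volumes/Shared/ollama/models"

# u+rwX: owner keeps full access.
# go+rX,go-w: group/others can read and traverse, but not write.
# Capital X adds execute only on directories (and already-executable files).
if [ -d "$SHARED" ]; then
  chmod -R u+rwX,go+rX,go-w "$SHARED"
fi
```

Note this is just file permissions; users running their own Ollama instance against this directory would not be able to pull or delete models there.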

@sonnyjlewis commented on GitHub (Jun 15, 2024):

I'm using an external USB 3.1 (USB-C) M.2 NVMe enclosure and I have had zero issues with speed. It's as responsive as the on-board drive.


@yy7054wyq5 commented on GitHub (Jan 7, 2025):

> Here is a solution that should work for people on the Mac platform.
>
> ### Stop the Ollama process:
>
> `sudo killall ollama`
>
> ### Set the variable (this probably won't work, but it sent me down a rabbit hole that led to the right solution):
>
> `launchctl setenv OLLAMA_MODELS "/Volumes/YourExternalVolume/location/path/"`
>
> _You'll need to make sure you provide the correct external volume name and path. This approach most likely won't work, so if it's a no-go for you, continue reading._
>
> ### What did work:
>
> `sudo nano /etc/launchd.conf`
>
> Use nano, vi, or whatever editor you prefer. The file might not exist yet; if it's empty, that's OK too.
>
> Add the following:
>
> ```
> # Set environment variables here so they are available globally to all apps
> # (and Terminal), including those launched via Spotlight.
> #
> # After editing this file, run the following command from the terminal to update
> # environment variables globally without needing to reboot.
> # NOTE: You will still need to restart the relevant application (including
> # Terminal) to pick up the changes!
> # grep -E "^setenv" /etc/launchd.conf | xargs -t -L 1 launchctl
> #
> # See http://www.digitaledgesw.com/node/31
> # and http://stackoverflow.com/questions/135688/setting-environment-variables-in-os-x/
> #
> # Note that you must hardcode the paths below; don't use environment variables.
> # You also need to surround multiple values in quotes.
> #
> setenv OLLAMA_HOME /Applications/Ollama.app
> setenv OLLAMA_MODELS /Users/yourusername/.ollama /Volumes/externalmtpoint/morefoldersifdesired/.ollama
> ```
>
> Change `yourusername` to your local user's username if you installed via the GUI installer. Change `externalmtpoint/morefoldersifdesired` to whatever directories on your external mount point you desire.
>
> Reboot and enjoy.

On my Mac mini M4, I use this:

```
setenv OLLAMA_HOME /Applications/Ollama.app
setenv OLLAMA_MODELS /Volumes/Kingston/models
```

If anyone still encounters problems, check the logs at `/Users/yourusername/.ollama/logs/server.log`. The latest entries are at the bottom of the file:

```
2025/01/07 21:05:16 routes.go:1259: INFO server config env="map[HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/Volumes/Kingston/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://*] OLLAMA_SCHED_SPREAD:false http_proxy: https_proxy: no_proxy:]"
```
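The `server config env=` log line is the authoritative record of which models directory the server actually picked up, so checking it programmatically is a quick sanity test after any of the changes above. A minimal sketch; the helper function is hypothetical, only the log-line format is taken from the thread:

```python
import re

def models_dir_from_log(line: str):
    """Extract the OLLAMA_MODELS value from a 'server config env=' log line."""
    m = re.search(r"OLLAMA_MODELS:(\S+?)(?:\s|\])", line)
    return m.group(1) if m else None

# Shortened sample of the log line shown above.
sample = ('INFO server config env="map[OLLAMA_HOST:http://127.0.0.1:11434 '
          'OLLAMA_MODELS:/Volumes/Kingston/models OLLAMA_MULTIUSER_CACHE:false]"')
print(models_dir_from_log(sample))  # /Volumes/Kingston/models
```

If the printed path is still the default `~/.ollama/models`, the environment variable never reached the server process.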
Reference: github-starred/ollama#2287