[GH-ISSUE #4749] OLLAMA_MODELS not applied on initial start or on restart after upgrade on macOS #2992

Closed
opened 2026-04-12 13:23:12 -05:00 by GiteaMirror · 7 comments

Originally created by @vernonstinebaker on GitHub (May 31, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4749

The addition of OLLAMA_MODELS is much appreciated, since it allows specifying a different location, such as an external disk, where more space might be available.

One issue, however, is that if we specify OLLAMA_MODELS in our .zshrc, for example, that file isn't read when Ollama starts initially or when it restarts after an update.

Perhaps I'm missing something?

Otherwise, it would be great if this could be configured/set directly in Ollama, instead of having to quit Ollama, open a Terminal (so that .zshrc is read), and start Ollama from the Terminal just so the OLLAMA_MODELS directory is used instead of the default.

GiteaMirror added the feature request label 2026-04-12 13:23:12 -05:00

@thinkverse commented on GitHub (May 31, 2024):

`.zshrc` is only read by interactive shells; you might have better luck using `launchctl`, which interacts directly with `launchd`, the service macOS uses for launching processes (and more). It is also what Ollama recommends in its documentation for [setting environment variables on Mac](https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-mac).

```
launchctl setenv OLLAMA_MODELS <directory>
launchctl getenv OLLAMA_MODELS
launchctl unsetenv OLLAMA_MODELS
```
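One caveat worth noting: `launchctl setenv` only applies to the currently running launchd session and does not survive a reboot. A minimal sketch of one common workaround, assuming a per-user LaunchAgent is acceptable, is to have launchd re-apply the variable at every login; the label and model path below are illustrative, not anything Ollama ships.

```
# Hypothetical LaunchAgent: re-applies OLLAMA_MODELS at each login.
# Adjust the model path to point at your external disk.
mkdir -p ~/Library/LaunchAgents
cat > ~/Library/LaunchAgents/com.example.ollama-env.plist <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.ollama-env</string>
    <key>ProgramArguments</key>
    <array>
        <string>/bin/launchctl</string>
        <string>setenv</string>
        <string>OLLAMA_MODELS</string>
        <string>/Volumes/External/ollama/models</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
</dict>
</plist>
EOF

launchctl load ~/Library/LaunchAgents/com.example.ollama-env.plist
```

Ordering relative to login items isn't guaranteed, so the Ollama app may still need one restart after login before it picks the variable up.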

@vernonstinebaker commented on GitHub (May 31, 2024):

Perfect. Thanks for the pointer.


@tomasznazarenko commented on GitHub (Jul 26, 2024):

For some reason after an update Ollama started to ignore the environment variable set using launchctl.

You may try creating a symbolic link from `~/.ollama/models` to your `<target_path>`:

```
ln -s <target_path> ~/.ollama/models
```

This should make the Ollama models directory actually live at the specified `<target_path>`.
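For anyone following this route, a minimal sketch, assuming Ollama is quit first and that an existing models directory should be moved rather than recreated (the external path is illustrative):

```
# Move the existing model store to the external disk, then link it back.
# Assumes the Ollama app is not running while files are moved.
mv ~/.ollama/models /Volumes/External/ollama-models
ln -s /Volumes/External/ollama-models ~/.ollama/models
```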


@vernonstinebaker commented on GitHub (Jul 26, 2024):

@tomasznazarenko Are you running the Sequoia beta?

launchctl was working as expected on Sonoma, but on Sequoia the model path set with launchctl doesn't seem to be respected at boot. If you quit Ollama and restart it from the command line, the launchctl variable does seem to be recognized.
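A quick way to check what is actually in effect, assuming the default log location the macOS app uses:

```
# Is the variable still set in the user's launchd session?
launchctl getenv OLLAMA_MODELS

# The directory the running server resolved can usually be seen
# in the startup lines of the app's server log.
tail -n 50 ~/.ollama/logs/server.log
```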


@dhiltgen commented on GitHub (Jul 1, 2025):

I think we can close this one now that 0.9.4 has a settings UI that allows easy changing of the model location, and the setting persists.

(Screenshot of the model location setting in the Ollama settings UI: https://github.com/user-attachments/assets/f5e34a34-d1d9-4321-80b4-6deb4fc2ec51)

@scottrbaxter commented on GitHub (Jul 4, 2025):

I realize that this was already closed, but @dhiltgen could you please confirm that this UI setting persists on reboot without requiring login? My concern is that this service will still require user auth after restart, before the service/UI actually runs.

The only viable solution so far requires manually creating a LaunchDaemon that sets `OLLAMA_HOST`. It would be much better to support something like a referenced config file, rather than UI configuration or requiring each individual user to manually create a custom LaunchDaemon. Perhaps the UI already does this, but I'd like to confirm that the service can run non-interactively if macOS restarts.
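For reference, a minimal sketch of that kind of LaunchDaemon, assuming the server is run directly via `ollama serve` from a CLI binary installed system-wide; the label, binary path, listen address, and model path are all illustrative rather than anything Ollama ships:

```
# Hypothetical system-wide LaunchDaemon: starts `ollama serve` at boot,
# before any user logs in, with explicit OLLAMA_HOST / OLLAMA_MODELS.
sudo tee /Library/LaunchDaemons/com.example.ollama.plist > /dev/null <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.ollama</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/bin/ollama</string>
        <string>serve</string>
    </array>
    <key>EnvironmentVariables</key>
    <dict>
        <key>OLLAMA_HOST</key>
        <string>0.0.0.0:11434</string>
        <key>OLLAMA_MODELS</key>
        <string>/Library/Ollama/models</string>
    </dict>
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
EOF

sudo launchctl load /Library/LaunchDaemons/com.example.ollama.plist
```

A daemon like this runs as root unless a `UserName` key is added, so a dedicated service account plus matching permissions on the models directory are usually worth adding.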


@dhiltgen commented on GitHub (Jul 6, 2025):

The settings persist across reboots, but the desktop app lifecycle is generally tied to user login. I'm not sure if your use case includes multiple users, but bear in mind the settings are per-user.

When you attempt to use the CLI before the desktop app has started (or after quitting, or if you disable start on login), it will auto-launch the desktop app. If there isn't an active GUI login session for that user, this will fail, so the CLI would fail to connect to the server.

If your intent is to have it run on a Mac fully headless, I'd recommend wiring up your own service management via launchd and the ollama-darwin.tgz bundle we publish on every build; then you can control everything to match your needs. You can have it start at boot with a service account, and shared model storage regardless of which user tries to use it. The desktop app is optimized for users on the graphical console.
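A rough sketch of that approach, assuming the standalone tarball from the GitHub releases page and a LaunchDaemon like the one sketched earlier in this thread; the exact download URL, extraction layout, and paths are assumptions to check against the current release:

```
# Fetch the standalone macOS build (asset name per the comment above;
# exact URL assumed, check the releases page).
curl -L -o /tmp/ollama-darwin.tgz \
  https://github.com/ollama/ollama/releases/latest/download/ollama-darwin.tgz

# Install the CLI somewhere the daemon can reach (tarball layout assumed).
sudo tar -xzf /tmp/ollama-darwin.tgz -C /usr/local/bin

# Shared model storage, readable by the account the daemon runs as.
sudo mkdir -p /Library/Ollama/models

# Load a system LaunchDaemon (e.g. the one sketched earlier) so the
# server starts at boot with no GUI login session.
sudo launchctl load /Library/LaunchDaemons/com.example.ollama.plist
```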


Reference: github-starred/ollama#2992