[GH-ISSUE #712] Where is the model file path on MacOS #26088

Closed
opened 2026-04-22 02:02:43 -05:00 by GiteaMirror · 14 comments

Originally created by @RoversX on GitHub (Oct 5, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/712

Hello, I would like to know where the model path is on macOS, and how I can fully uninstall Ollama, because I installed it in the wrong place.

Thanks


@RoversX commented on GitHub (Oct 5, 2023):

Oh it's here

`~/.ollama/models`


@xyproto commented on GitHub (Oct 6, 2023):

Ideally, Ollama should store the cache in `~/Library/Caches/ollama` on macOS, instead of in `~/.ollama`.


@Clivern commented on GitHub (Oct 6, 2023):

Is there any way to copy and release these models for a multi-node setup? I can't find them locally!

Under `~/.ollama/models` I see `blobs` and `manifests`.


@RoversX commented on GitHub (Oct 7, 2023):

> Is there any way to copy and release these models for a multi-node setup? I can't find them locally!
>
> Under `~/.ollama/models` I see `blobs` and `manifests`.

If you check the folder sizes, the data is under `blobs`.
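As a quick check (assuming the default macOS location discussed in this thread), `du` shows which subdirectory actually holds the weight data:

```shell
# Manifests are small JSON files; the large weight files live in blobs.
du -sh ~/.ollama/models/blobs ~/.ollama/models/manifests
```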


@justinmayer commented on GitHub (Oct 9, 2023):

Knowing where models are stored, as well as what other things will happen on first launch, would be **_so_** much easier to discover and understand if someone would merge the [pull request I submitted back in August](https://github.com/jmorganca/ollama/pull/395) (#395).

It seems cruel to subject so many first-time users to this kind of confusion when the problem could be so easily solved by mashing the _Merge_ button 😞


@xyproto commented on GitHub (Oct 9, 2023):

Also, when using ollama within a GitHub action, it would be helpful to be able to cache models and only pull models if they are not already pulled. Having a file with the same name as the model (perhaps with `:` replaced with `_`) would be nice.
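The name mapping itself is a one-character substitution; a minimal sketch (the model tag is only an example):

```shell
# Convert a model tag such as "llama2:7b" into a filesystem-safe name
# by replacing ":" with "_", as suggested above.
model="llama2:7b"
marker="$(printf '%s' "$model" | tr ':' '_')"
echo "$marker"   # llama2_7b
```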


@technovangelist commented on GitHub (Dec 4, 2023):

You can put models anywhere you like by using the `OLLAMA_MODELS` environment variable, which I think addresses the issue. I will go ahead and close it now. If you think there is anything we left out, reopen it and we can address it. Thanks for being part of this great community.
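For example (the path is a placeholder; as later comments in this thread note, the variable must be visible to the server process, so on macOS quit the menu-bar app first and start the server from the same shell):

```shell
# Point Ollama at a custom models directory and start the server
# from this shell so it inherits the variable.
export OLLAMA_MODELS="$HOME/my-ollama-models"
mkdir -p "$OLLAMA_MODELS"
ollama serve
```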


@brandoncarl commented on GitHub (Dec 9, 2023):

Thanks for the repository.

I am running v0.1.13 on macOS Sonoma. The `OLLAMA_MODELS` environment variable is having no impact.

```
$ echo $OLLAMA_MODELS
(prints appropriate directory)

$ ollama run <model>
(downloads to ~/.ollama/..)

$ OLLAMA_MODELS=<directory> run <model>
(downloads to ~/.ollama/...)
```

@mtrin commented on GitHub (Dec 19, 2023):

Same here - `OLLAMA_MODELS` has no effect on the folder for the Mac app. @technovangelist


@xyproto commented on GitHub (Dec 19, 2023):

@brandoncarl @mtrin Is `OLLAMA_MODELS` set when executing `ollama serve`, or just when executing `ollama run`?


@mtrin commented on GitHub (Dec 20, 2023):

It seems like you have to quit the Mac app, then run `ollama serve` with `OLLAMA_MODELS` set in the terminal, which is like the Linux setup rather than a Mac "app" setup. From the documentation it didn't seem like `ollama serve` was a necessary step on the Mac.
Once I did it, it worked.


@brandoncarl commented on GitHub (Dec 23, 2023):

The best workaround for this is to remove the environment variable and to instead create a symlink.

`ln -s <target_path> ~/.ollama/models`
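Concretely (paths are placeholders; moving the existing store first means the link target already contains the downloaded models):

```shell
# Relocate the model store to an external volume, then symlink it back
# so Ollama keeps reading and writing ~/.ollama/models transparently.
mv ~/.ollama/models /Volumes/External/ollama-models
ln -s /Volumes/External/ollama-models ~/.ollama/models
```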


@wyy511511 commented on GitHub (Jul 11, 2024):

> I am running v0.1.13 on macOS Sonoma. The `OLLAMA_MODELS` environment variable is having no impact.

1. Quit the Ollama app in the top-right tray.
2. `vim ~/.zshrc`
3. Add `export OLLAMA_MODELS="{placeholder for your path}"`
4. `source ~/.zshrc`

It works for me.


@unikitty37 commented on GitHub (Aug 11, 2025):

There seems to be a setting for this in `Settings…` now — but the README still says to use the environment variable.

@technovangelist Is the environment variable still the best way to do this, or does the README need updating to reflect the UI change? (It's still worth keeping information on `launchctl setenv` around, of course, since there isn't UI for other variables…)

Reference: github-starred/ollama#26088