[GH-ISSUE #733] where is everything? #46852

Closed
opened 2026-04-28 00:52:47 -05:00 by GiteaMirror · 27 comments
Owner

Originally created by @iplayfast on GitHub (Oct 8, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/733

I don't use Docker so maybe there are obvious answers that I don't know.
I've downloaded the install from the website and it put it in the /usr/local/bin directory. Not my first choice. For testing software I want to put it in a user directory.
It ran fine and pulled the Mistral models. Only thing is, I've already got them downloaded. I could not tell it to use my downloads, and
I have no idea where it downloaded to. So now I've got wasted space on my limited hard drive.

I then cloned the repo and built it. It built fine, but I can't actually find what it built.
It says:

[ 58%] Building CXX object common/CMakeFiles/common.dir/common.cpp.o
[ 75%] Built target common
[ 83%] Built target BUILD_INFO
[ 91%] Building CXX object examples/server/CMakeFiles/server.dir/server.cpp.o
[100%] Linking CXX executable ../../bin/server
[100%] Built target server

but there isn't any server file in my bin directory. I really hate not knowing where things are going.


@ndsteve commented on GitHub (Oct 8, 2023):

It looks like the models are in /usr/share/ollama/.ollama. They're files that start with a dot, so you'd use ls -al to see them listed.
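A quick sketch of the dot-directory behavior (demonstrated on a throwaway directory rather than the real install path, which may need root to read):

```shell
# Dot-directories are invisible to a plain "ls"; "ls -a" (or "ls -al")
# reveals them. Temp directory stands in for /usr/share/ollama.
d=$(mktemp -d)
mkdir "$d/.ollama"
ls "$d"                    # prints nothing: the dot-directory is hidden
ls -a "$d" | grep ollama   # prints .ollama
```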


@justinmayer commented on GitHub (Oct 9, 2023):

Knowing where models are stored, as well as what other things will happen on first launch, would be so much easier to discover and understand if someone would merge the pull request I submitted back in August (#395).

It seems cruel to subject so many first-time users to this kind of confusion when the problem could be so easily solved by mashing the Merge button 😞


@bilogic commented on GitHub (Oct 27, 2023):

@ndsteve

Though I want to clear up that it is /usr/share/ollama/ only when following the steps from https://github.com/jmorganca/ollama/blob/main/docs/linux.md#adding-ollama-as-a-startup-service-recommended

I'm on Ubuntu and data/models are stored in the .ollama folder under the home folder of the user that runs the ollama server, i.e. ~/.ollama.

If Ubuntu users still can't find it, install and run updatedb, then locate ollama to find the files.
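If updatedb/locate aren't available, plain find does the same job. A sketch on a temporary tree (on a live system you would search from / with sudo, since some install locations are root-owned):

```shell
# Recreate the layout in a scratch directory, then locate the .ollama
# directory with find; swap "$d" for / (and add sudo) on a real system.
d=$(mktemp -d)
mkdir -p "$d/usr/share/ollama/.ollama/models"
find "$d" -type d -name .ollama
```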


@technovangelist commented on GitHub (Dec 4, 2023):

I think https://github.com/jmorganca/ollama/blob/main/docs/faq.md#where-are-models-stored should answer all your questions. I will go ahead and close the issue now. If you think there is anything we left out, reopen and we can address it. Thanks for being part of this great community.


@Joshua2point0 commented on GitHub (Jan 23, 2024):

> I don't use Docker so maybe there are obvious answers that I don't know. I've downloaded the install from the website and it put it in the /usr/local/bin directory. Not my first choice. For testing software I want to put it in a user directory. It ran fine and pulled the Mistral models. Only thing is, I've already got them downloaded. I could not tell it to use my downloads, and I have no idea where it downloaded to. So now I've got wasted space on my limited hard drive.
>
> I then cloned the repo and built it. It built fine, but I can't actually find what it built. It says "[ 58%] Building CXX object common/CMakeFiles/common.dir/common.cpp.o [ 75%] Built target common [ 83%] Built target BUILD_INFO [ 91%] Building CXX object examples/server/CMakeFiles/server.dir/server.cpp.o [100%] Linking CXX executable ../../bin/server [100%] Built target server" but there isn't any server file in my bin directory. I really hate not knowing where things are going.

They are ultimately in the library folder at the end of the directory structure:
/usr/share/ollama/.ollama/models/manifests/registry.ollama.ai/library/""


@thephimart commented on GitHub (Jan 24, 2024):

So I decided to move from a WSL2 install to Docker on WSL2, and I finally found where the models are stored for Docker:
/var/lib/docker/volumes/ollama/_data/models

You will need to sudo -s to root to go there.

Maybe https://github.com/jmorganca/ollama/blob/main/docs/faq.md#where-are-models-stored could be updated?


@sbharat0147 commented on GitHub (Jan 30, 2024):

I have set up Ollama on RHEL 8.6 for offline use. I copied all the model files into the /usr/share/ollama/.ollama/ folder. But when I run the Ollama server the models are not listed in api/tags, and in the log I get the message "...llama2 from registry file is getting skipped".
I am unable to list models anyhow, and not able to upload a new gguf file either.


@dekarpaulvictor commented on GitHub (Feb 2, 2024):

Ollama's official install script creates a user called 'ollama' on your system and sets its home directory to /usr/share/ollama. Just as your own user directory would normally be under /home/yourname and you'd find the hidden .ollama directory in your home directory, so the .ollama directory is now under /usr/share/ollama. Here is the relevant section of the install script for your reference (the options -m -d instruct the useradd command to create the user home directory at the specified location):

$SUDO useradd -r -s /bin/false -m -d /usr/share/ollama ollama
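One way to confirm where that useradd call put the service account's home (and hence the .ollama directory) is the passwd database. The sketch below queries root, which exists on any Linux box, because the ollama user is only present after the install script has run; substitute ollama on an installed system:

```shell
# Field 6 of a passwd entry is the home directory. For the ollama
# service user this would print /usr/share/ollama after install;
# root is used here only as a user guaranteed to exist.
getent passwd root | cut -d: -f6
```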


@TheDevelolper commented on GitHub (Apr 7, 2024):

I installed Ollama using snap on Ubuntu, but the folder for the models doesn't seem to be there. I'm not sure; perhaps I'm missing it, but I did try ls -la to make sure it wasn't due to hidden files.


@richardstevenhack commented on GitHub (Jun 26, 2024):

Doesn't exist on openSUSE Tumbleweed using the openSUSE rpm. This is why I couldn't use the default install: that standard directory does not exist, for no known reason, on openSUSE, so the install fails.

On my system the models are supposedly stored in: /home/user/.ollama/models.

Except I just downloaded Qwen2 and it doesn't show as being in that directory. And yes, I have Show Hidden Files enabled.

So now I'm searching for the damn thing again. It's running, so obviously it has to be somewhere in 56TB of data...

Found it. Went to Yast and looked at the install record. Models on openSUSE Tumbleweed are stored in:
/var/lib/ollama/.ollama/models/. For no known reason, of course, courtesy of whoever implemented the openSUSE package.

Which is why I hate programmers - consistency is not their strong suit.


@BrainSlugs83 commented on GitHub (Jun 28, 2024):

For anyone who stumbles here trying to figure out where this tool stores its models (because it's literally not documented anywhere as far as I can tell, and this is like the top Google search result):

  • On Linux it sounds like this should be located at ~/.ollama/models
  • On Windows this is in your user profile directory, %USERPROFILE%\.ollama\models
    (e.g. C:\Users\<your-user-alias>\.ollama\models)

Unfortunately from there, they're stored in some kind of blob format with hash-based filenames, making it completely unmanageable from the file system perspective, which is a pretty cool feature. 🙄


@richardstevenhack commented on GitHub (Jun 28, 2024):

Again, where the models are stored seems to be distro-dependent. The default install on Linux is /usr/share/ollama, but that doesn't work on openSUSE Tumbleweed.

The Ollama documentation says this:

> Where are models stored?
>
> macOS: ~/.ollama/models
> Linux: /usr/share/ollama/.ollama/models
> Windows: C:\Users\%username%\.ollama\models
But openSUSE Tumbleweed decided to store it in /var/lib/ollama/.ollama/models/ - as I said above, for no known reason.

Why programmers are obsessed with "Not Invented Here" and have to change everything is beyond me. Too much damn ego, I suspect.


@bilogic commented on GitHub (Jun 28, 2024):

The setup examples used /usr/share/ollama so that every user does not start having their own identical 5-10GB copy of the model but can instead share 1 single copy on the OS and save on storage.

But if your organization policy does not allow file sharing, then the remaining alternative is to store it in your HOME folder.

> But openSUSE Tumbleweed decided to store it in /var/lib/ollama/.ollama/models/ - as I said above, for no known reason.

The ~ in front of ~/.ollama/models will be replaced with the HOME folder of the running user; this is essential knowledge, especially on unix-esque OSes, given that this is a CLI tool, and the reason is not unknown.
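The expansion described above is easy to see directly; the shell substitutes the home directory before the path ever reaches a program:

```shell
# Tilde expansion happens in the shell, not in ollama: both lines
# below print the same absolute path for the current user.
echo ~/.ollama/models
echo "$HOME/.ollama/models"
```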


@richardstevenhack commented on GitHub (Jul 16, 2024):

I know what ~ means... The point is that all this is stored in a system directory on a path with root ownership of the intervening directories. Programs such as MSTY cannot download Ollama models to the Ollama models directory because they don't have permission. The Ollama service doesn't have that problem.

The proper solution is to ask on install whether the program is to be shared with multiple users or a single user, and install the program and models directories according to the response. Better yet: ASK the end user where everything is to be installed, program and models separately, and instruct the ollama server appropriately, either with environment variables (a clumsy method) or, better yet, an actual config file stored in the user's home directory under ~/.config/ollama (or in /usr/share).


@st3w4r commented on GitHub (Aug 2, 2024):

For a deeper understanding, see the code here: https://github.com/ollama/ollama/tree/main/app/store.

The function getStorePath is specific to each system.


@wnm3 commented on GitHub (Nov 21, 2024):

Not sure why the /usr/share/ollama directory is only accessible by the ollama user. For people trying to figure out which model file is which in the /usr/share/ollama/.ollama/models directory: use the content of the /usr/share/ollama/.ollama/models/manifests/registry.ollama.ai/library/ directory (e.g., for granite3-dense content, use the files under library/granite3-dense; 8b or 2b in my case) to find the digest element, which gives you the sha256 value to match against the blob filenames.

Does it break something if we chmod so other users can gain access to these files?
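The digest-to-blob matching described above can be sketched with standard tools. The toy manifest below is illustrative only (real manifests live under the manifests/registry.ollama.ai/library/ tree and contain more fields); the point is the rename from sha256:<hash> in the manifest to sha256-<hash> on disk:

```shell
# Blob files are named sha256-<hash>; the manifest records each layer
# as "digest":"sha256:<hash>". Extract the digests and swap ':' for '-'
# to get the corresponding filename under models/blobs/.
manifest=$(mktemp)
cat > "$manifest" <<'EOF'
{"layers":[{"mediaType":"application/vnd.ollama.image.model","digest":"sha256:abc123"}]}
EOF
grep -o '"digest":"[^"]*"' "$manifest" | cut -d'"' -f4 | tr ':' '-'
```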


@kudorgyozo commented on GitHub (Dec 7, 2024):

There has to be a better way than to search for "ollama install folder" and finding this issue as the first result.


@Kreijstal commented on GitHub (Feb 22, 2025):

> Again, where the models are stored seems to be distro-dependent. The default install on Linux is /usr/share/ollama, but that doesn't work on openSUSE Tumbleweed.
>
> The Ollama documentation says this:
>
> macOS: ~/.ollama/models
> Linux: /usr/share/ollama/.ollama/models
> Windows: C:\Users\%username%\.ollama\models
>
> But openSUSE Tumbleweed decided to store it in /var/lib/ollama/.ollama/models/ - as I said above, for no known reason.
>
> Why programmers are obsessed with "Not Invented Here" and have to change everything is beyond me. Too much damn ego, I suspect.

Maybe /usr/share is for static files and /var/lib is for things software downloads later?


@richardstevenhack commented on GitHub (Feb 22, 2025):

My final solution was simple: Since MSTY has an embedded Ollama server and stores models downloaded to it in the /home/<USER>/.config/Msty/models directory, I just stopped using Ollama and use MSTY as the Ollama server. (This could, I suppose, be an issue if your /home directory is short on space.) Any other program can usually be pointed to the embedded Ollama server, which runs on http://localhost:10000/.

Why futz with a command line Ollama when you can have a nice GUI with many advanced AI interactive features? It even enables RAG with an entire Obsidian Vault of notes. Multiple models at once, split chats simultaneously. Use models from Ollama or HuggingFace. Terrific.


@iplayfast commented on GitHub (Feb 26, 2025):

Msty is overpriced (i.e. it's not open source). If you want to go the browser route, then check out Open WebUI:
https://github.com/open-webui/open-webui


@richardstevenhack commented on GitHub (Feb 26, 2025):

It's free to me and anyone else who doesn't want the extra perks they're charging for. :-)

And I don't like the browser route. Or the docker route. That's why I like the AppImage they provide.


@ifounda-bug commented on GitHub (May 1, 2025):

> Again, where the models are stored seems to be distro-dependent. The default install on Linux is /usr/share/ollama, but that doesn't work on openSUSE Tumbleweed.
>
> The Ollama documentation says this:
>
> macOS: ~/.ollama/models
> Linux: /usr/share/ollama/.ollama/models
> Windows: C:\Users\%username%\.ollama\models
>
> But openSUSE Tumbleweed decided to store it in /var/lib/ollama/.ollama/models/ - as I said above, for no known reason.
>
> Why programmers are obsessed with "Not Invented Here" and have to change everything is beyond me. Too much damn ego, I suspect.

There seems to be a good reason why openSUSE has done that. I have been running a manual install of ollama, and over time I have found disk space to mysteriously disappear. Uninstalling models didn't free nearly as much space as it should have. I have tracked the issue down to the fact that /usr/share/ is within snapper's default snapshotting locations. In other words, ollama's terrible decision to install models in /usr/share has now caused me a significant problem in freeing disk space, which requires me to delete every last snapshot I have of my system, which itself is non-trivial.


@Kreijstal commented on GitHub (May 1, 2025):

> > Again, where the models are stored seems to be distro-dependent. The default install on Linux is /usr/share/ollama, but that doesn't work on openSUSE Tumbleweed.
> >
> > The Ollama documentation says this:
> >
> > macOS: ~/.ollama/models
> > Linux: /usr/share/ollama/.ollama/models
> > Windows: C:\Users\%username%\.ollama\models
> >
> > But openSUSE Tumbleweed decided to store it in /var/lib/ollama/.ollama/models/ - as I said above, for no known reason.
> >
> > Why programmers are obsessed with "Not Invented Here" and have to change everything is beyond me. Too much damn ego, I suspect.
>
> There seems to be a good reason why openSUSE has done that. I have been running a manual install of ollama, and over time I have found disk space to mysteriously disappear. Uninstalling models didn't free nearly as much space as it should have. I have tracked the issue down to the fact that /usr/share/ is within snapper's default snapshotting locations. In other words, ollama's terrible decision to install models in /usr/share has now caused me a significant problem in freeing disk space, which requires me to delete every last snapshot I have of my system, which itself is non-trivial.

That is a very good reason, but why didn't you use .local/share?


@richardstevenhack commented on GitHub (May 1, 2025):

Interesting. I didn't think about the impact on BTRFS snapshots. Currently, I'm using MSTY, the GUI front end with an embedded Ollama server, so I don't need to pull models from Ollama anymore. However, MSTY stores its models in /home/<user>/.config/Msty/models, which is included in BTRFS snapshots if the /home directory is not on another partition. Fortunately my root and home directories are on a 1TB NVMe SSD, and I don't change models that often, so hopefully it won't become an issue. I don't have any other user data in /home; everything else is on other hard drives.


@MParvin commented on GitHub (Jun 30, 2025):

I found models in the following path (Fedora 42):

/home/MY_USERNAME/.local/share/docker/volumes/go-developer_ollama_data/_data/models/blobs/


@Lukem121 commented on GitHub (Feb 6, 2026):

https://docs.ollama.com/faq#where-are-models-stored


@nine1one commented on GitHub (Mar 30, 2026):

I use Ollama installed directly on Windows and also inside WSL. I figured out how to use both the Windows models path and the Ubuntu models path seamlessly if you are in a situation like mine. I came across this closed issue by chance, and I’m sharing what worked for me, hoping it might help others.

Paths:

  • On Windows Explorer [WSL Running]: \\wsl.localhost\Ubuntu\usr\share\ollama\.ollama\models\
  • Inside WSL Ubuntu default Ollama models path: /usr/share/ollama/.ollama/models
  • Inside WSL accessing Windows filesystem: /mnt/c/Users/<windows-username>/.ollama/models

Steps to set environment variables in WSL Ubuntu:

  1. Open the systemd editor for the Ollama service:

     sudo systemctl edit ollama.service

  2. Add your desired environment variables under the [Service] section, exactly between the comments as shown:

     ### Editing /etc/systemd/system/ollama.service.d/override.conf
     ### Anything between here and the comment below will become the contents of the drop-in file

     [Service]
     Environment="OLLAMA_HOST=0.0.0.0"
     Environment="OLLAMA_MODELS=/mnt/c/Users/<windows-username>/.ollama/models"

     ### Edits below this comment will be discarded

Since I don’t use WSL all the time, I also keep Ollama installed outside of WSL. This setup lets Ollama on both Windows and WSL access models from a single location—the Windows default—without creating duplicates, saving storage.

Reference: github-starred/ollama#46852