[GH-ISSUE #1737] Where is ollama storing models? #991

Closed
opened 2026-04-12 10:42:22 -05:00 by GiteaMirror · 14 comments

Originally created by @sushiselite on GitHub (Dec 29, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1737

I was under the impression that Ollama stores the models locally. However, when I run Ollama on a different address with
`OLLAMA_HOST=0.0.0.0 ollama serve`, `ollama list` says I do not have any models installed and I need to pull them again.

This issue occurs every time I change the IP/port

I have also performed the steps given in the docs:

```
mkdir -p /etc/systemd/system/ollama.service.d
echo '[Service]' >>/etc/systemd/system/ollama.service.d/environment.conf
echo 'Environment="OLLAMA_HOST=0.0.0.0:11434"' >>/etc/systemd/system/ollama.service.d/environment.conf
```

Running `ollama serve` by itself still listens only on localhost:11434, where I have models, while manually changing it with `OLLAMA_HOST=0.0.0.0 ollama serve` makes the models disappear.

@sthufnagl commented on GitHub (Dec 30, 2023):

Hi,

Maybe this helps: https://github.com/jmorganca/ollama/issues/1687
I figured out that the model location depends on how you start Ollama: with `ollama serve` or as a system service.


@BruceMacD commented on GitHub (Jan 2, 2024):

Hi @sushiselite, Ollama should be storing the models where it is being served from, [although the directory can change](https://github.com/jmorganca/ollama/blob/main/docs/faq.md#where-are-models-stored).

Are you connecting to a different Ollama host server, or restarting a container when the host changes, by chance? I'm wondering whether the models stored in a container are being lost when it restarts.


@sushiselite commented on GitHub (Jan 2, 2024):

Hey @BruceMacD!

Ollama seems to have a different set of models when run with a plain `ollama serve` on localhost than when run with `OLLAMA_HOST=0.0.0.0 ollama serve`.


@technovangelist commented on GitHub (Jan 2, 2024):

When you run `ollama serve` on the command line, you are running it as your user, so your models will be in `~/.ollama`. When you described your issue at the top, you mentioned that you created `/etc/systemd/system/ollama.service.d/environment.conf`, which updates the service, and the service saves the models to `/usr/share/ollama/.ollama` because it runs ollama as the `ollama` user. But then you ran `ollama serve`. To use the new service configuration, it would be better to restart the service: `sudo systemctl restart ollama`. Does that make sense?
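To make the two locations concrete, here is a small sketch (the paths assume the default Linux install, where the service runs as the `ollama` user):

```
# Apply the drop-in and restart the service instead of running `ollama serve` by hand
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Models pulled through the systemd service live under the ollama user's home
sudo ls /usr/share/ollama/.ollama/models

# Models pulled while running `ollama serve` as your own user live under your home
ls ~/.ollama/models
```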


@Th3Rom3 commented on GitHub (Jan 3, 2024):

@technovangelist I appreciate the detailed reply. Maybe this would be worth adding to the respective [FAQ section](https://github.com/jmorganca/ollama/blob/main/docs/faq.md#where-are-models-stored)?

It might be obvious to those who are well versed in how these services behave, but it wouldn't hurt to mention it. Although I can also get behind keeping the documentation lean.


@marianoarga commented on GitHub (Jan 8, 2024):

Setting the env variable is not working for me; it keeps storing the models in /usr/share...

```
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin OLLAMA_MODELS=/opt/ai/models"

[Install]
WantedBy=default.target
```

@marianoarga commented on GitHub (Jan 8, 2024):

It worked after putting the env variables in a `service.d/` drop-in.
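For anyone else reading along, the shape of such a drop-in is roughly this; a minimal sketch, where the override file name and the /opt/ai/models path are only examples:

```
# /etc/systemd/system/ollama.service.d/override.conf  (example file name)
[Service]
Environment="OLLAMA_MODELS=/opt/ai/models"
```

Then reload and restart so the drop-in is picked up: `sudo systemctl daemon-reload && sudo systemctl restart ollama`.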


@sthufnagl commented on GitHub (Jan 8, 2024):

I did it manually with `export OLLAMA_MODELS=/usr/share/ollama/.ollama/models` before starting `ollama serve`,
and it worked.

With other environment variables I do it like this, and it worked as well:

[Service]
Environment="OLLAMA_HOST=192.168.0.29:11434"
Environment="OLLAMA_ORIGINS=*"


@ganakee commented on GitHub (Jan 26, 2024):

@marianoarga I looked at your ollama.service file.

In the service section, there is this single entry:
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin OLLAMA_MODELS=/opt/ai/models"

> [Service]
> ExecStart=/usr/local/bin/ollama serve
> User=ollama
> Group=ollama
> Restart=always
> RestartSec=3
> Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"

This should be two entries:
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
Environment="OLLAMA_MODELS=/opt/ai/models"

Note the two Environment entries. I had a similar issue with HSA override. The ollama.service file takes multiple Environment entries; if one tries to pack them all into one, however, systemd ignores the surplus. So, in the quoted file, systemd ignores everything after the PATH entry. I received no error indicating this in my instance.
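A quick way to confirm what systemd actually parsed (a sketch; it assumes the unit is named `ollama` as in the standard install) is to dump the unit's Environment property after a reload:

```
# Reload unit files, then print every environment assignment systemd recognized
sudo systemctl daemon-reload
systemctl show ollama --property=Environment
```

If OLLAMA_MODELS does not appear in the output, the assignment was swallowed by the single Environment= line.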


@jmorganca commented on GitHub (Feb 20, 2024):

@sushiselite on linux this can be set using: https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux

Let me know if this helps!


@markelzhang commented on GitHub (Mar 3, 2024):

> @marianoarga I looked at your ollama.service file.
>
> In the service section, there is one: Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin OLLAMA_MODELS=/opt/ai/models"
>
> [Service]
> ExecStart=/usr/local/bin/ollama serve
> User=ollama
> Group=ollama
> Restart=always
> RestartSec=3
> Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
>
> This should be: Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin" Environment="OLLAMA_MODELS=/opt/ai/models"
>
> Note the two Environment entries. I had a similar issue with HSA override. The ollama.service file takes multiple Environment entries. If one tries to pack them all into one, however, systemd ignores the surplus. So, in the quoted file, systemd ignores everything after the path entries. I received no error indicating this in my instance.

But when I change the config file /etc/systemd/system/ollama.service like this:

Environment="OLLAMA_MODELS=/data/models"

and try to restart Ollama, I get the error `could not connect to ollama app`. When I try `ollama serve`, it hangs at `source=routes.go:1042 msg="no GPU detected"`.

To prove it, I commented out the OLLAMA_MODELS line. It works again.

Does anyone know how to solve it?
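One way to narrow this down (a sketch; it assumes the service runs as the `ollama` user and logs to the journal) is to check the service logs and whether that user can actually write to the new directory:

```
# Show recent service log lines, which include any permission errors
journalctl -u ollama -n 50 --no-pager

# Check whether the service user can list and write the new models directory
sudo -u ollama ls -ld /data/models
sudo -u ollama touch /data/models/.write-test && sudo rm /data/models/.write-test
```

If the `touch` fails, directory ownership or mode is the problem, which matches the fix in the next comment.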


@markelzhang commented on GitHub (Mar 3, 2024):

> > @marianoarga I looked at your ollama.service file.
> > In the service section, there is one: Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin OLLAMA_MODELS=/opt/ai/models"
> >
> > [Service]
> > ExecStart=/usr/local/bin/ollama serve
> > User=ollama
> > Group=ollama
> > Restart=always
> > RestartSec=3
> > Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
> >
> > This should be: Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin" Environment="OLLAMA_MODELS=/opt/ai/models"
> > Note the two Environment entries. I had a similar issue with HSA override. The ollama.service file takes multiple Environment entries. If one tries to pack them all into one, however, systemd ignores the surplus. So, in the quoted file, systemd ignores everything after the path entries. I received no error indicating this in my instance.
>
> But when I change the config file /etc/systemd/system/ollama.service like this:
>
> Environment="OLLAMA_MODELS=/data/models"
>
> and try to restart Ollama, I get the error `could not connect to ollama app`. When I try `ollama serve`, it hangs at `source=routes.go:1042 msg="no GPU detected"`.
>
> To prove it, I commented out the OLLAMA_MODELS line. It works again.
>
> Does anyone know how to solve it?

After trying several times, I found the cause is directory permissions. Solve it like this: `sudo chown ollama:ollama /data/models`.
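For a fresh setup, the equivalent steps look roughly like this (a sketch; the /data/models path is just the example from above):

```
# Create the custom models directory and hand it to the service user
sudo mkdir -p /data/models
sudo chown -R ollama:ollama /data/models

# Reload the unit changes and restart so OLLAMA_MODELS takes effect
sudo systemctl daemon-reload
sudo systemctl restart ollama
```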


@abb128 commented on GitHub (Oct 18, 2024):

For anyone arriving from Google: check `/usr/share/ollama/.ollama/models` or `/var/lib/ollama/.ollama/models` (as was the case for me on Arch). The model files are stored in the `blobs` subdirectory with unhelpful names, but you can sort by size and inspect a file's contents to find its name like so:

```
hexdump -C -n 256 /var/lib/ollama/.ollama/models/blobs/sha256-8eeb52dfb3bb9aefdf9d1ef24b3bdbcfbe82238798c4b918278320b6fcef18fe
```

You can serve them with llama-server, as they are GGUF files. This is useful if you need features Ollama is missing (e.g. batching, grammar, completion):

```
./build/bin/llama-server -m /var/lib/ollama/.ollama/models/blobs/sha256-8eeb52dfb3bb9aefdf9d1ef24b3bdbcfbe82238798c4b918278320b6fcef18fe -c 2048
```
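Another way to map a model name back to its blob, assuming your Ollama build supports the `--modelfile` flag of `ollama show`, is to print the generated Modelfile; its FROM line points at the blob path:

```
# Print the Modelfile for a pulled model; the FROM line contains the blob path
# ("llama3" is only an example model name)
ollama show llama3 --modelfile | grep '^FROM'
```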

@mirao commented on GitHub (Dec 23, 2024):

@abb128 thanks.

On Ubuntu 24.04, it's in `/usr/share/ollama/.ollama/models`. I installed Ollama from https://ollama.com/download and it is started as a service.
Sudo access (to become `ollama` or `root`) is needed to read it manually.

```
$ ls -ld /usr/share/ollama/
drwxr-x--- 4 ollama ollama 4096 Dec 23 09:53 /usr/share/ollama/
```
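If you only want to browse the directory without changing its permissions, running the listing as the service user also works; a small sketch:

```
# List the model blobs as the ollama service user, largest first
sudo -u ollama ls -lhS /usr/share/ollama/.ollama/models/blobs
```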