[GH-ISSUE #3041] Models disappear when changing the OLLAMA_HOST to 0.0.0.0 #1871

Closed
opened 2026-04-12 11:56:46 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @Howe829 on GitHub (Mar 10, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3041

Hi there! I'm going to let devices access Ollama over the LAN, so I set OLLAMA_HOST=0.0.0.0.
When I restart Ollama, the models I pulled before disappear; I don't know whether this is a bug or something else.
In addition, I think we need a 'restart' command to restart the server.
Thanks in advance for the help.
Ollama: 0.1.28
My OS info: Ubuntu 23.04, Linux-6.2.0-39-generic


@aosan commented on GitHub (Mar 11, 2024):

Are the models becoming not available via the network or via the console?

Ollama can be restarted with `systemctl restart ollama.service`


@Howe829 commented on GitHub (Mar 11, 2024):

Hello @aosan, the models are not shown by `ollama list`, and when I call the /generate API it returns 404.
Thanks for the restart command, but it seems to have no effect with OLLAMA_HOST set. The host still stays at 127.0.0.1 when I set OLLAMA_HOST to 0.0.0.0 and use `systemctl restart ollama.service` to restart Ollama.

The host changes to "[::]" when I stop Ollama with `systemctl stop ollama` and start it with `ollama serve`.


@aosan commented on GitHub (Mar 11, 2024):

Are you defining `OLLAMA_HOST=0.0.0.0` in `/etc/systemd/system/ollama.service`?

After you put the config in ollama.service, one way to confirm it listens on 0.0.0.0 over IPv4 is to check with:

`sudo netstat -tapen | grep ollama`

and it should show you something like:

`tcp 0 0 0.0.0.0:11434 0.0.0.0:* LISTEN 968 291784 35171/ollama`

If you see `::`, it might be IPv6, where the same netstat command would show you something like this:

`tcp6 0 0 ::1:11434 :::* LISTEN 0 14221 2114/ollama`

I hope you can get it working; these configuration fogs can be frustrating.
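As an aside not from the original thread: on newer distributions where netstat (net-tools) isn't installed, a roughly equivalent check uses `ss` from iproute2:

```shell
# List listening TCP sockets (-t TCP, -l listening, -n numeric, -p process)
# and filter for Ollama's default port. A local address of 0.0.0.0:11434
# means a wildcard IPv4 bind; [::]:11434 or ::1:11434 indicates IPv6.
sudo ss -tlnp | grep 11434
```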


@jmorganca commented on GitHub (Mar 11, 2024):

Hi there, when changing `OLLAMA_HOST` on Linux it's important to do this in the systemd service, since Ollama runs as its own process and user on the system. You can do that by following [this guide](https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux). Let me know if that helps!
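For reference, the guide's approach amounts to a systemd drop-in override; a minimal sketch (the unit name `ollama.service` comes from this thread, the rest follows the linked FAQ):

```shell
# Open an override file for the service
# (creates /etc/systemd/system/ollama.service.d/override.conf):
sudo systemctl edit ollama.service

# In the editor, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Reload unit files and restart so the server rebinds with the new setting:
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
```

Setting the variable in your own shell profile has no effect on the service, because systemd does not inherit a login shell's environment.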


@Howe829 commented on GitHub (Mar 12, 2024):

Thank you very much for your help!


@PittYao commented on GitHub (Mar 13, 2024):

What about Windows?


@Howe829 commented on GitHub (Mar 13, 2024):

@PittYao You can check [this guide](https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-windows) out
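A quick sketch of what that guide describes, assuming a standard Windows install (commands run in cmd.exe):

```shell
:: Persist OLLAMA_HOST for the current user; setx writes it to the
:: registry, so it only takes effect in newly started processes.
setx OLLAMA_HOST "0.0.0.0"

:: Then quit Ollama from the system tray and relaunch it so the
:: server picks up the new variable.
```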

Reference: github-starred/ollama#1871