[GH-ISSUE #1735] Server doesn't listen on all available interfaces #989

Closed
opened 2026-04-12 10:41:31 -05:00 by GiteaMirror · 4 comments

Originally created by @zine999 on GitHub (Dec 28, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1735

I think this might be a problem recently introduced in v0.1.17 but I'm not 100% sure.

`ollama serve` doesn't listen on `0.0.0.0` and therefore doesn't make itself available on all interfaces. This causes problems when trying to connect to it via an interface other than `localhost`.

A (hopefully temporary) workaround is using a utility like `socat`, e.g. to listen on all interfaces on port `8888` and relay traffic to port `11434`:

```
$ socat TCP-LISTEN:8888,reuseaddr,fork TCP:localhost:11434
```
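For illustration only, the loopback-vs-all-interfaces distinction behind this report can be sketched with plain sockets (a minimal sketch, not Ollama's actual code):

```python
import socket

# A TCP socket bound to 127.0.0.1 only accepts connections arriving on the
# loopback interface; one bound to 0.0.0.0 (INADDR_ANY) accepts them on
# every interface. Port 0 asks the OS to pick any free port.
loopback = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
loopback.bind(("127.0.0.1", 0))
loopback.listen()

wildcard = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
wildcard.bind(("0.0.0.0", 0))
wildcard.listen()

print(loopback.getsockname()[0])  # 127.0.0.1 -> reachable via loopback only
print(wildcard.getsockname()[0])  # 0.0.0.0   -> reachable on all interfaces

loopback.close()
wildcard.close()
```

The `socat` relay above works around exactly this: it binds `0.0.0.0:8888` itself and forwards each connection to the loopback-only listener on `11434`.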
GiteaMirror added the bug label 2026-04-12 10:41:31 -05:00

@easp commented on GitHub (Jan 2, 2024):

Listen address defaults to localhost. Are you setting it to 0.0.0.0?
https://github.com/jmorganca/ollama/blob/main/docs/faq.md#how-can-i-expose-ollama-on-my-network
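A quick way to check whether the server is reachable on a given address after changing the bind setting is a plain TCP probe (`can_connect` below is a hypothetical helper for illustration, not part of Ollama):

```python
import socket

def can_connect(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# With the default bind address only the loopback probe succeeds; after
# restarting the server with the bind address set to 0.0.0.0, a probe via a
# LAN address should succeed too (the address below is a placeholder):
#   can_connect("127.0.0.1", 11434)
#   can_connect("192.168.1.10", 11434)
```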


@zine999 commented on GitHub (Jan 3, 2024):

I see, so it defaults to 127.0.0.1... but wouldn't it make more sense for the default to be 0.0.0.0 to avoid connection issues, while still allowing it to be overridden with something else like 127.0.0.1? I don't think there's a security issue, as servers should have firewalls blocking traffic on those ports anyway, and if it's run within a Docker container the ports still need to be manually exposed.


@easp commented on GitHub (Jan 3, 2024):

Ollama should use a secure configuration by default. The host's firewall is another layer of defense, and having multiple layers of security is consistent with the principle of defense in depth.


@zine999 commented on GitHub (Jan 4, 2024):

🤷‍♂️ OK, I'll close the issue, thanks for pointing me to the `OLLAMA_HOST` env var.
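As a sketch of how a client script could honor the same convention, one can read `OLLAMA_HOST` from the environment and fall back to the documented default of `127.0.0.1:11434` (the `resolve_host` helper is hypothetical, not Ollama's implementation):

```python
import os

def resolve_host(env=None):
    """Hypothetical helper: read OLLAMA_HOST ("host" or "host:port") from
    the environment, defaulting to 127.0.0.1:11434."""
    env = os.environ if env is None else env
    raw = env.get("OLLAMA_HOST", "127.0.0.1:11434")
    host, _, port = raw.partition(":")
    return host or "127.0.0.1", int(port) if port else 11434

print(resolve_host({}))                               # ('127.0.0.1', 11434)
print(resolve_host({"OLLAMA_HOST": "0.0.0.0:8888"}))  # ('0.0.0.0', 8888)
```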

Reference: github-starred/ollama#989