[GH-ISSUE #12456] Support binding multiple IPs #54789

Open
opened 2026-04-29 07:18:36 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @ryanisn on GitHub (Sep 30, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12456

I have a web app running in local docker that calls ollama, and I also need to call ollama directly from the host.
To secure access to the ollama API, I want to bind to both localhost and host.docker.internal (172.17.0.1).

I do not want to bind to 0.0.0.0; that would be a security concern, since this ollama server is for local use only.
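For illustration, roughly the behavior I'm asking for: the same handler bound on a comma-separated list of addresses. This is a minimal sketch, not ollama's code, and the comma-separated format is an assumption; it binds two loopback ports (port 0 picks free ones) so it runs anywhere, where a real setup would use 127.0.0.1:11434 and 172.17.0.1:11434.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):  # silence per-request logging
        pass

def serve_all(hosts):
    """Bind the same handler on every host:port in a comma-separated list."""
    servers = []
    for spec in hosts.split(","):
        host, _, port = spec.strip().rpartition(":")
        srv = ThreadingHTTPServer((host, int(port)), Handler)
        threading.Thread(target=srv.serve_forever, daemon=True).start()
        servers.append(srv)
    return servers

# Port 0 picks free loopback ports so the sketch runs anywhere; a real
# server would use the two fixed addresses from this issue.
servers = serve_all("127.0.0.1:0,127.0.0.1:0")
replies = []
for srv in servers:
    host, port = srv.server_address[:2]
    with urllib.request.urlopen(f"http://{host}:{port}/") as r:
        replies.append(r.read())
print(replies)
```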

GiteaMirror added the feature request label 2026-04-29 07:18:36 -05:00

@rick-github commented on GitHub (Sep 30, 2025):

Put ollama in a container and adjust the port settings:

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - 127.0.0.1:11434:11434
      - 172.17.0.1:11434:11434
```

@ryanisn commented on GitHub (Sep 30, 2025):

> Put ollama in a container and adjust the port settings:
>
> ```yaml
> services:
>   ollama:
>     image: ollama/ollama:latest
>     ports:
>       - 127.0.0.1:11434:11434
>       - 172.17.0.1:11434:11434
> ```

I'd prefer to run ollama locally; the docker service is not always up.


@rick-github commented on GitHub (Sep 30, 2025):

Add a firewall rule to drop packets to 11434 that come from the LAN/WAN.
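For example, with iptables (a sketch: it assumes ollama is listening on 0.0.0.0 with packets filtered at the firewall, and that the docker bridge uses the default 172.17.0.0/16 subnet):

```shell
# Allow loopback and the docker bridge; drop everything else to 11434.
iptables -A INPUT -p tcp --dport 11434 -i lo -j ACCEPT
iptables -A INPUT -p tcp --dport 11434 -s 172.17.0.0/16 -j ACCEPT
iptables -A INPUT -p tcp --dport 11434 -j DROP
```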


@vanboom commented on GitHub (Oct 1, 2025):

A firewall rule would work, but nobody does this for NGINX or Apache configurations. Is it not feasible to add support, similar to NGINX's "listen" directive, for multiple IP addresses?


@rick-github commented on GitHub (Oct 1, 2025):

Run nginx as a reverse proxy, listening on multiple addresses and forwarding requests to an ollama server listening on 127.0.0.1:11435.
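That setup might look like the following sketch (the upstream port 11435 comes from the suggestion above; ollama's bind address is set via the OLLAMA_HOST environment variable, e.g. OLLAMA_HOST=127.0.0.1:11435):

```nginx
# Listen on both addresses the issue asks for, and forward to a
# loopback-only ollama instance on 11435.
server {
    listen 127.0.0.1:11434;
    listen 172.17.0.1:11434;

    location / {
        proxy_pass http://127.0.0.1:11435;
    }
}
```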


@rick-github commented on GitHub (Oct 1, 2025):

To be clear, this is a valid feature request. I'm suggesting ways to achieve the required outcome now without waiting for a PR.

Reference: github-starred/ollama#54789