[GH-ISSUE #22520] feat: Bundled Ollama not exposing port #19731

Closed
opened 2026-04-20 02:14:33 -05:00 by GiteaMirror · 1 comment

Originally created by @q20 on GitHub (Mar 10, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/22520

Check Existing Issues

  • I have searched for all existing open AND closed issues and discussions for similar requests. I have found none that is comparable to my request.

Verify Feature Scope

  • I have read through and understood the scope definition for feature requests in the Issues section. I believe my feature request meets the definition and belongs in the Issues section instead of the Discussions.

Problem Description

Deployed with ghcr.io/open-webui/open-webui:ollama.
nmap shows a single open port: the Open WebUI port, 8080.

nmap 172.17.0.3
Starting Nmap 7.97 ( https://nmap.org ) at 2026-03-10 10:02 +0100
Nmap scan report for 172.17.0.3
Host is up (0.00010s latency).
Not shown: 999 closed tcp ports (conn-refused)
PORT     STATE SERVICE
8080/tcp open  http-proxy

Nmap done: 1 IP address (1 host up) scanned in 0.55 seconds

I expect to see Ollama's port (11434) exposed.

Desired Solution you'd like

Allow us to expose the Ollama port. I have tried -p 11434:11434, but nmap shows that 8080 is still the only open port.
I can find nothing in the docs describing how to achieve this.
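For context on why `-p 11434:11434` alone doesn't help: publishing a port only forwards traffic; the service inside the container must also listen on an externally reachable interface. Ollama's own listen address is controlled by its `OLLAMA_HOST` environment variable, so one hedged approach is to publish the port *and* set that variable. Whether the bundled image's start script forwards `OLLAMA_HOST` to the embedded Ollama process is an assumption to verify, not documented behavior:

```shell
# Sketch, not a confirmed recipe: publish both ports and ask the bundled
# Ollama to bind all interfaces via OLLAMA_HOST (a real Ollama variable).
# Whether the image's startup script passes it through to the bundled
# Ollama process is an ASSUMPTION -- check the image's start script.
docker run -d \
  -p 3000:8080 \
  -p 11434:11434 \
  -e OLLAMA_HOST=0.0.0.0 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:ollama
```

The better-supported alternative is to run the official `ollama/ollama` image as a separate container with its port published, and point a plain `open-webui` container at it via `OLLAMA_BASE_URL`.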

Alternatives Considered

No response

Additional Context

No response


@q20 commented on GitHub (Mar 10, 2026):

I think I have misunderstood how interaction with Ollama works. You have documented how to interact with Open WebUI via its API, which is what I wanted to achieve:

https://docs.openwebui.com/reference/api-endpoints

I'll leave this here in case another palooka like myself asks the same question.
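For anyone landing here with the same question, the pattern from the linked docs is to call Open WebUI's own API on port 8080 rather than reaching Ollama directly. A minimal sketch against the OpenAI-compatible chat endpoint; the host, model name, and key placeholder are assumptions (an API key is generated from the user's account settings in the UI):

```shell
# Sketch: talk to Open WebUI's API on 8080 instead of Ollama's 11434.
# "llama3.1" and the key are placeholders -- substitute your own.
curl -s http://localhost:8080/api/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3.1",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```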


Reference: github-starred/open-webui#19731