[GH-ISSUE #3951] Can we still access ollama :11434 port with docker cuda AIO installs? #13442

Closed
opened 2026-04-19 20:10:57 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @c2h2 on GitHub (Jul 17, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/3951

I installed this wonderful software with the CUDA Docker AIO install command, but this install only exposes port 3000, not port 11434, which I need so that other software (e.g. LangChain's Ollama() integration) can still interact with Ollama directly.

I have tried launching the container with that port exposed, but Ollama does not seem to accept non-localhost requests.

Can we still access port 11434 from outside Docker?
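A likely workaround (a sketch, not an official answer from this issue): publish port 11434 in addition to the web UI port, and set `OLLAMA_HOST` so the bundled Ollama binds all interfaces instead of only localhost. The image tag and volume paths below are assumptions based on the upstream Open WebUI install instructions; adjust them to match your actual AIO command.

```shell
# Hedged sketch: run the bundled image with Ollama's API published to the host.
# Assumptions: the ghcr.io/open-webui/open-webui:ollama bundled image and the
# OLLAMA_HOST environment variable follow the upstream docs.
docker run -d \
  --gpus=all \
  -p 3000:8080 \
  -p 11434:11434 \
  -e OLLAMA_HOST=0.0.0.0 \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:ollama
```

The second `-p` publishes Ollama's API alongside the web UI, and `OLLAMA_HOST=0.0.0.0` tells Ollama to listen on all interfaces inside the container so the published port actually accepts external requests. You can then verify from the host with `curl http://localhost:11434/api/tags`.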


Reference: github-starred/open-webui#13442