WebUI could not connect to Ollama | Windows Docker #4021

Closed
opened 2025-11-11 15:44:38 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @BIN6711 on GitHub (Feb 21, 2025).

Installation Method

  • Installed using Docker on Windows 11 Home.
  • Running both Ollama and Open WebUI as Docker containers.

Environment

  • Open WebUI Version: v0.5.16 (latest)
  • Ollama Version: 0.5.11
  • Operating System: Windows 11 Home
  • Docker Version: 4.38.0 (181591) (latest)

Expected Behavior

  • Open WebUI should successfully connect to Ollama using http://192.168.100.192:11434.
  • Models should be loaded and available within Open WebUI.

Actual Behavior

  • Open WebUI fails to connect to Ollama and logs the following error:
    ERROR [open_webui.routers.ollama] Connection error: Cannot connect to host 192.168.100.192:11434 ssl:default [Connect call failed ('192.168.100.192', 11434)]
  • However, manually visiting http://192.168.100.192:11434/api/tags in the browser works and returns the available models.
  • Open WebUI only connects when using http://host.docker.internal:11434, but this only works locally and not for other devices on the network.
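One way to narrow this down (a diagnostic sketch, not from the original report; container and URL names are taken from the reproduction steps below): the error is raised inside the container's network namespace, so compare reachability from the host against reachability from inside the open-webui container.

```shell
# Hypothetical diagnostic; assumes the container names used in the steps below.
OLLAMA_URL="http://192.168.100.192:11434/api/tags"

# From the Windows host this reportedly returns the model list:
command -v curl >/dev/null && curl -s --max-time 5 "$OLLAMA_URL" \
  || echo "host check skipped or failed"

# From inside the container it reportedly fails; if it fails here too, the
# container cannot reach the host's LAN IP through Docker Desktop's NAT.
# (The image may lack curl, so its Python interpreter is used instead.)
command -v docker >/dev/null && docker exec open-webui python3 -c \
  "import urllib.request; print(urllib.request.urlopen('$OLLAMA_URL', timeout=5).status)" \
  || echo "container check skipped or failed"
```

If the host check succeeds while the container check fails, the problem is container-to-host routing rather than Ollama itself.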

Reproduction Details
Steps to Reproduce:

  • Start Ollama with the following command:
    docker run -d --gpus all --name ollama -p 11434:11434 -v "D:\MYFiles\Projects\AI\Models\Ollama:/root/.ollama" --restart always ollama/ollama
  • Confirm Ollama is running:
    curl http://192.168.100.192:11434/api/tags
    This correctly returns a JSON response with available models.
  • Start Open WebUI with:
    docker run -d -p 83:8080 -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://192.168.100.192:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  • Open the Open WebUI page in a browser and try to access models.
  • Open WebUI logs the following error:
    ERROR [open_webui.routers.ollama] Connection error: Cannot connect to host 192.168.100.192:11434 ssl:default [Connect call failed ('192.168.100.192', 11434)]
  • Change OLLAMA_BASE_URL to http://host.docker.internal:11434 and restart Open WebUI.
    docker rm -f open-webui
    docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://host.docker.internal:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
    This works, but only on the local machine. Other devices cannot connect.
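The steps above can be sidestepped with one common workaround (an assumption on my part, not something the report tried): attach both containers to a user-defined Docker bridge network so Open WebUI reaches Ollama by container name, while still publishing both ports so other devices on the LAN can connect. The network name below is illustrative.

```shell
# Hypothetical setup: both containers on one user-defined Docker network.
NETWORK_NAME="ollama-net"

if command -v docker >/dev/null; then
  docker network create "$NETWORK_NAME" 2>/dev/null || true
  docker rm -f ollama open-webui 2>/dev/null || true

  # Ollama: joined to the shared network; port still published for LAN use.
  docker run -d --gpus all --name ollama --network "$NETWORK_NAME" \
    -p 11434:11434 \
    -v "D:\MYFiles\Projects\AI\Models\Ollama:/root/.ollama" \
    --restart always ollama/ollama

  # Open WebUI: reaches Ollama by container name over the shared network,
  # and is itself published on port 83 so other devices can browse to it.
  docker run -d --name open-webui --network "$NETWORK_NAME" \
    -p 83:8080 \
    -v open-webui:/app/backend/data \
    -e OLLAMA_BASE_URL=http://ollama:11434 \
    --restart always ghcr.io/open-webui/open-webui:main
else
  echo "docker not available in this environment"
fi
```

With this layout `OLLAMA_BASE_URL` no longer depends on the host's LAN IP or on `host.docker.internal`, which is what breaks for remote clients in the report.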

Reference: github-starred/open-webui#4021