[GH-ISSUE #9226] Open-WebUI takes a very long time to become usable when it cannot connect to api.openai.com #30954

Closed
opened 2026-04-25 05:04:43 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @endotronic on GitHub (Feb 2, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/9226

Bug Report

Installation Method

Docker via this docker-compose configuration:

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    environment:
      OLLAMA_BASE_URL: (redacted internal URL)
      HF_HUB_OFFLINE: "1"
    volumes:
      - ./open-webui:/app/backend/data
    networks:
      - proxy
      - container_lan_only_net
    restart: unless-stopped

Environment

I used the latest tags for Open WebUI and Ollama today. I can look up what versions these are if it becomes necessary.

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

Open WebUI claims that it was designed to operate entirely offline. I expect that no matter what network I put the container on, as long as I can access the hosted web interface, I can use the UI. In my case, I provided a network with access to Ollama in another container, but no Internet access.

Actual Behavior:

If I don't provide internet access, I get a blank page after login. If I add a network with an internet gateway to my Docker container, everything works great right away. Without internet access, the blank page eventually becomes usable when a call to api.openai.com times out.

Description

Bug Summary:
I can see in Docker logs that a call is made to api.openai.com that times out. Right after login, I see a blank page for a long time while waiting on this call. When it eventually times out, the UI is usable.

Reproduction Details

Steps to Reproduce:

  1. Set up Open WebUI in a Docker container with Ollama access but no internet access
  2. Create and log in to the admin account
  3. Observe a blank page
  4. Wait approximately 2 minutes
  5. Observe a working Ollama prompt
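Step 1 can be reproduced with an internal-only Docker network. A minimal sketch, not the reporter's exact setup (service and network names here are illustrative):

```yaml
# Hypothetical compose sketch: an "internal" network gives the container
# LAN access to Ollama but no route to the internet, so the startup
# request to api.openai.com can only time out.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      OLLAMA_BASE_URL: http://ollama:11434
    networks:
      - lan_only

  ollama:
    image: ollama/ollama
    networks:
      - lan_only

networks:
  lan_only:
    internal: true   # no gateway to the outside; outbound connections fail
```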

Logs and Screenshots

Browser Console Logs:
(Screenshot: https://github.com/user-attachments/assets/3bd753b4-26fb-407f-98d5-22ef724ab76d)

Docker Container Logs:

ollama-webui  | INFO:     connection open
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/v1/auths/ HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/config HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/changelog HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/v1/users/user/settings HTTP/1.1" 200 OK
ollama-webui  | INFO  [open_webui.routers.openai] get_all_models()
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/v1/configs/banners HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/v1/tools/ HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /ollama/api/version HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/v1/channels/ HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /static/favicon.png HTTP/1.1" 304 Not Modified
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/v1/users/user/settings HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/v1/chats/all/tags HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/v1/chats/pinned HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/v1/folders/ HTTP/1.1" 200 OK
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/v1/chats/?page=2 HTTP/1.1" 200 OK
ollama-webui  | INFO:     ('192.168.101.101', 0) - "WebSocket /ws/socket.io/?EIO=4&transport=websocket" [accepted]
ollama-webui  | INFO:     connection open
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/version/updates HTTP/1.1" 200 OK
ollama-webui  | INFO:     connection closed
ollama-webui  | INFO:     connection closed
ollama-webui  | ERROR [open_webui.routers.openai] Connection error: Cannot connect to host api.openai.com:443 ssl:default [Connect call failed ('162.159.140.245', 443)]
ollama-webui  | INFO  [open_webui.routers.ollama] get_all_models()
ollama-webui  | INFO:     192.168.101.101:0 - "GET /api/models HTTP/1.1" 200 OK

Screenshots/Screen Recordings (if applicable):

Author
Owner

@endotronic commented on GitHub (Feb 2, 2025):

Ugh, I just realized I can turn OpenAI off in the settings, and then everything works great. I guess you can close this, but I will leave it open as a suggestion to make this more obvious.
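For reference, the same toggle the comment describes can also be set at deploy time via the `ENABLE_OPENAI_API` environment variable from the Open WebUI env-configuration docs; a sketch applied to the compose file from this report:

```yaml
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    environment:
      OLLAMA_BASE_URL: (redacted internal URL)
      HF_HUB_OFFLINE: "1"
      ENABLE_OPENAI_API: "false"   # disable the OpenAI connection, skipping the api.openai.com call
```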

Author
Owner

@tjbck commented on GitHub (Feb 2, 2025):

Alternative solution: https://docs.openwebui.com/getting-started/env-configuration#aiohttp_client_timeout_openai_model_list
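The linked setting caps how long the model-list request to the OpenAI endpoint may block. A sketch of the compose environment, using the variable named in the linked docs (the value is an illustrative choice):

```yaml
    environment:
      # Fail the api.openai.com model-list fetch after 5 seconds instead of
      # blocking the UI for the default (much longer) timeout.
      AIOHTTP_CLIENT_TIMEOUT_OPENAI_MODEL_LIST: "5"
```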
Reference: github-starred/open-webui#30954