issue: If server has no access to internet - page loading takes around 10 seconds due to OpenAI timeout #4718

Closed
opened 2025-11-11 16:01:23 -06:00 by GiteaMirror · 2 comments
Owner

Originally created by @PolarNick239 on GitHub (Apr 4, 2025).

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

v0.6.0

Ollama Version (if applicable)

0.6.2

Operating System

Ubuntu 22.04

Browser (if applicable)

Brave Version 1.77.95 Chromium: 135.0.7049.52

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

If the server running ollama + open-webui has no internet access, everything should still work fine (opening the webui from other computers in the LAN).

Actual Behavior

If the internet is inaccessible, each page loads very slowly, in ~10 seconds.
The open-webui logs around this delay look like:

Apr 04 16:36:16 username open-webui[1010]: 2025-04-04 16:36:16.686 | INFO | open_webui.routers.openai:get_all_models:389 - get_all_models() - {}
Apr 04 16:36:26 username open-webui[1010]: 2025-04-04 16:36:26.775 | ERROR | open_webui.routers.openai:send_get_request:81 - Connection error: - {}

Steps to Reproduce

  1. Launch ollama service on server (computer A)
  2. Launch open-webui service on server (computer A)
  3. Ensure that you can access open-webui from another computer B in LAN just fine (page loads quickly)
  4. On computer A disable access to internet (without blocking LAN access):
# Allow connections in LAN
sudo ufw allow from 192.168.0.0/16
sudo ufw allow out to 192.168.0.0/16
# Block all other connections to the internet
sudo ufw default deny incoming
sudo ufw default deny outgoing
sudo ufw enable
  5. Reboot to make sure the firewall configuration is applied
  6. Launch the ollama + open-webui services
  7. Try to access open-webui from another computer B in the LAN - it will be slow, and the page will take ~10 seconds to load
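The blocked-internet / working-LAN state from the steps above can be sanity-checked from computer A before testing the webui (the LAN address `192.168.0.10:8080` below is illustrative, not from the report):

```shell
# Outbound internet should time out or be refused...
curl --max-time 5 -sS https://api.openai.com/v1/models \
  && echo "internet reachable" \
  || echo "internet blocked"
# ...while open-webui on the LAN should still answer:
curl --max-time 5 -sS -o /dev/null -w "HTTP %{http_code}\n" \
  http://192.168.0.10:8080/ \
  || echo "webui not reachable"
```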

Logs & Screenshots

If the internet is inaccessible, each page loads very slowly, in ~10 seconds.
The open-webui logs around this delay look like:

Apr 04 16:36:16 username open-webui[1010]: 2025-04-04 16:36:16.686 | INFO | open_webui.routers.openai:get_all_models:389 - get_all_models() - {}
Apr 04 16:36:26 username open-webui[1010]: 2025-04-04 16:36:26.775 | ERROR | open_webui.routers.openai:send_get_request:81 - Connection error: - {}

Additional Information

Workaround:

Open Admin Panel -> Settings -> Connections -> disable OpenAI API:

![Image](https://github.com/user-attachments/assets/b0fa6ecf-e523-44ad-8464-0845d8c64ef8)
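For headless or scripted setups, the same workaround can be applied without the UI via Open WebUI's `ENABLE_OPENAI_API` environment variable (a sketch; set it in the service environment before startup):

```shell
# Disable the OpenAI API connection at startup, equivalent to
# toggling it off under Admin Panel -> Settings -> Connections.
export ENABLE_OPENAI_API=false
echo "ENABLE_OPENAI_API=$ENABLE_OPENAI_API"
# then start the service as usual, e.g.:
#   open-webui serve
```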

GiteaMirror added the bug label 2025-11-11 16:01:23 -06:00
Author
Owner

@tjbck commented on GitHub (Apr 4, 2025):

Intended behaviour check: https://docs.openwebui.com/getting-started/env-configuration#aiohttp_client_timeout_model_list
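Per the linked docs, the model-list fetch timeout is configurable, so an offline server can fail fast instead of waiting the full default timeout. A minimal sketch (the value `3` seconds is illustrative):

```shell
# Shorten only the timeout for fetching the OpenAI model list,
# instead of disabling the OpenAI connection entirely.
export AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST=3
echo "AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST=$AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST"
```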

Author
Owner

@PolarNick239 commented on GitHub (Apr 4, 2025):

So, by default, are OpenAI models queried in any way? Isn't it possible to add a simple check to see if there is an API key? Or do requests to OpenAI models make sense even if the user hasn't specified an OpenAI API key?


Reference: github-starred/open-webui#4718