OpenAI (local server) going offline brings Open-WebUI down #4225

Closed
opened 2025-11-11 15:49:06 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @frenzybiscuit on GitHub (Mar 4, 2025).

Installation Method

Python 3.11 with pip/venv

Environment

  • Open WebUI Version: 0.5.18

  • Operating System: Debian 12

  • Browser (if applicable): Firefox (MacOS)

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [ ] I have included the Docker container logs.
  • [x] I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

Works as expected

Actual Behavior:

When the OpenAI (locally hosted) API endpoint goes down, it brings down Open-WebUI with it. Open-WebUI does not return until the OpenAI endpoint does.

Description

Bug Summary:
See above

Reproduction Details

Steps to Reproduce:
Create an OpenAI endpoint like so:

Image: https://github.com/user-attachments/assets/cfd203b8-1738-474b-a107-620eca20f4ce

Take the OpenAI endpoint offline (shut the OpenAI compatible server down).

Watch Open-WebUI turn into a white page on page refresh and refuse to load anything until the OpenAI endpoint is back up.

Watch the open-webui logs, which will show the following:

2025-03-03 17:17:59.554 | INFO | open_webui.routers.openai:get_all_models:379 - get_all_models() - {}
2025-03-03 17:20:09.640 | ERROR | open_webui.routers.openai:send_get_request:78 - Connection error: Cannot connect to host 192.168.0.200:5000 ssl:default [Connect call failed ('192.168.0.200', 5000)] - {}
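The error above surfaces only after a long stall because the connect to the dead endpoint has no short, effective bound while the page load waits on it. A minimal stdlib sketch of that failure mode (not Open WebUI's actual code; host, port, and timeout values are illustrative): a bounded connect turns an unreachable endpoint into a fast, handled failure instead of a multi-minute hang.

```python
import socket

def endpoint_reachable(host: str, port: int, timeout_s: float = 2.0) -> bool:
    """Return True if a TCP connect to host:port succeeds within timeout_s.

    A closed local port is refused almost instantly; an unreachable or
    firewalled host blocks only until timeout_s elapses, rather than for
    the OS default connect timeout, which can run to minutes.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:  # ConnectionRefusedError, timeout, host unreachable, ...
        return False

# With the endpoint from the logs above offline, a caller using a bound
# like this would get False quickly instead of stalling the UI:
# endpoint_reachable("192.168.0.200", 5000, timeout_s=2.0)
```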

Additional Information

Open-WebUI DOES NOT come back online until the OpenAI endpoint is back online! Not even the admin section loads.


@tjbck commented on GitHub (Mar 4, 2025): https://docs.openwebui.com/getting-started/env-configuration#aiohttp_client_timeout_openai_model_list
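The linked setting caps how long Open WebUI waits on the OpenAI model-list request. A deployment sketch for a pip/venv install like the reporter's (the 5-second value is illustrative; see the docs page above for the authoritative name and default):

```shell
# Give up on the OpenAI model-list fetch after 5 seconds instead of
# waiting on a dead endpoint; set before launching open-webui.
export AIOHTTP_CLIENT_TIMEOUT_OPENAI_MODEL_LIST=5
open-webui serve
```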

Reference: github-starred/open-webui#4225