[GH-ISSUE #1550] WebUI hangs when one of the Models is offline #28076

Closed
opened 2026-04-25 02:48:50 -05:00 by GiteaMirror · 7 comments

Originally created by @Joonas12334 on GitHub (Apr 14, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1550

Bug Report

Description

Bug Summary:
The UI is not usable unless all of the configured models are online.

Steps to Reproduce:
Take one model offline and see the WebUI hang.

Expected Behavior:
The WebUI should only show the models that are online; offline models should either be hidden or shown in red as offline.

Actual Behavior:
The WebUI is not usable.

Environment

  • Operating System: Docker x64
  • Browser (if applicable): Chrome:Latest

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [x] I have included the Docker container logs.

Logs and Screenshots

Chrome Screenshot:
https://i.imgur.com/2JLkc1y.png

Docker Container Logs:
https://pastebin.com/LCuvPD18

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

Manual configuration with docker yaml


@tjbck commented on GitHub (Apr 14, 2024):

Unable to reproduce the issue on my end, but I just updated the code on our dev branch. Let us know if that fixes the issue for you, and keep us updated!


@justinh-rahb commented on GitHub (Apr 22, 2024):

This seems to have been addressed now. I've tested in my own environments, where I use multiple Ollama servers, and taking one offline no longer causes hangups.


@sebdanielsson commented on GitHub (Apr 23, 2024):

I still have this issue. I host Open WebUI on my server with an OpenAI API endpoint plus an Ollama model located on my laptop. When my laptop goes offline, I get a blank page when trying to load the web UI, even after the laptop comes back online. A restart is required to bring the web UI back up.

Open WebUI v0.1.120


@habaneraa commented on GitHub (Apr 26, 2024):

> I still have this issue. I host Open WebUI on my server with an OpenAI API endpoint plus an Ollama model located on my laptop. When my laptop goes offline, I get a blank page when trying to load the web UI, even after the laptop comes back online. A restart is required to bring the web UI back up.
>
> Open WebUI v0.1.120

Same issue here. The browser shows a blank page until it has obtained all available models. My observation is that a request to `<some_base_url>/v1/models` blocks the initial UI load. To check this in the browser, open DevTools > Network, where you will find a request that never receives a response.


@atiehamidi commented on GitHub (May 16, 2024):

I fixed it on my local copy. Can I create a pull request?
The issue is in src/lib/utils/index.ts.
`await` should not be used on each call inside `Promise.all()`. Instead of awaiting each call sequentially, all the requests can be fired off concurrently and then the result of `Promise.all()` awaited.

The offending function:


export const getModels = async (token: string) => {
	let models = await Promise.all([
		await getOllamaModels(token).catch((error) => {
			console.log(error);
			return null;
		}),
		await getOpenAIModels(token).catch((error) => {
			console.log(error);
			return null;
		}),
		await getLiteLLMModels(token).catch((error) => {
			console.log(error);
			return null;
		})
	]);

	models = models.filter((models) => models).reduce((a, e, i, arr) => a.concat(e), []);

	return models;
};
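
A minimal sketch of the suggested fix, assuming the same helper functions and file as above: with the inner `await`s removed, the three requests are issued concurrently and the function only awaits the combined `Promise.all()`.

```
export const getModels = async (token: string) => {
	// Build the three promises without awaiting them individually,
	// so all requests are issued concurrently.
	const results = await Promise.all([
		getOllamaModels(token).catch((error) => {
			console.log(error);
			return null;
		}),
		getOpenAIModels(token).catch((error) => {
			console.log(error);
			return null;
		}),
		getLiteLLMModels(token).catch((error) => {
			console.log(error);
			return null;
		})
	]);

	// Drop the sources that failed (null) and flatten into one list.
	return results.filter((r) => r).reduce((a, e) => a.concat(e), []);
};
```

Note that `Promise.all()` still waits for every promise to settle; the per-call `.catch()` handlers returning `null` are what keep a single failing backend from rejecting the whole call.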

@tjbck commented on GitHub (May 17, 2024):

@atiehamidi good catch, updated on dev branch.


@tjbck commented on GitHub (May 17, 2024):

Closing in favour of #2337

Reference: github-starred/open-webui#28076