[GH-ISSUE #1550] WebUI hangs when one of the Models is offline #28076
Originally created by @Joonas12334 on GitHub (Apr 14, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1550
Bug Report
Description
Bug Summary:
UI is not usable without all of the models being online.
Steps to Reproduce:
Take one model offline and see the WebUI hang.
Expected Behavior:
The WebUI should show only the models that are online, either hiding offline models or marking them (e.g. in red text) as offline.
Actual Behavior:
WebUI not usable.
Logs and Screenshots
Chrome Screenshot:
https://i.imgur.com/2JLkc1y.png
Docker Container Logs:
https://pastebin.com/LCuvPD18
Installation Method
Manual configuration with docker yaml
@tjbck commented on GitHub (Apr 14, 2024):
Unable to reproduce the issue on my end, but just updated the code on our dev branch. Let us know if that fixes the issue for you, keep us updated!
@justinh-rahb commented on GitHub (Apr 22, 2024):
This seems to have been addressed now, I've tested in my own environments where I use multiple Ollama servers and taking one offline is not causing hangups.
@sebdanielsson commented on GitHub (Apr 23, 2024):
I still got this issue. I host Open WebUI on my server with an OpenAI API endpoint plus an ollama model that's located on my laptop. When my laptop goes offline I get a blank page when trying to load the web UI. Even after the laptop goes back online. A restart is required to bring up the web UI again.
Open WebUI v0.1.120
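A common client-side mitigation for this failure mode is to bound each backend request with a timeout via `AbortController`, so one unreachable host cannot keep the page blank. This is a hedged sketch, not Open WebUI's actual code; `fetchWithTimeout` is an illustrative name:

```typescript
// Sketch: bound a request with a timeout so an offline backend fails
// fast instead of leaving the request pending forever.
// Assumes a runtime with global fetch (Node 18+, browsers).
async function fetchWithTimeout(url: string, ms: number): Promise<Response | null> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    return await fetch(url, { signal: controller.signal });
  } catch {
    // Treat a timed-out or unreachable backend as "no response"
    // rather than propagating the error and blocking the caller.
    return null;
  } finally {
    clearTimeout(timer);
  }
}
```

With this shape, a failed or timed-out backend degrades to a `null` result (e.g. an empty model list) instead of blocking the initial render.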
@habaneraa commented on GitHub (Apr 26, 2024):
Same issue. The browser shows a blank page until it has obtained all available models. My observation is that a request to <some_base_url>/v1/models was blocking the initial UI load. To check this in the browser, go to DevTools > Network tab, where you will find a request without a response.
@atiehamidi commented on GitHub (May 16, 2024):
I fixed it on my local copy. Can I create a pull request?
The issue is in src/lib/utils/index.ts.
await should not be used inside Promise.all(). Instead of awaiting each call sequentially, fire off all the requests concurrently and then await the result of Promise.all().
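The pattern described above can be sketched as follows. This is a minimal illustration, not the actual code in src/lib/utils/index.ts; `fetchModels` and the URLs are stand-ins:

```typescript
// Stand-in for a real per-backend request (names are illustrative).
async function fetchModels(url: string): Promise<string[]> {
  await new Promise((resolve) => setTimeout(resolve, 10)); // simulate network latency
  return [`${url}/model-a`];
}

// Anti-pattern: awaiting each call one by one serializes the requests,
// so a single slow or offline backend delays all the others.
async function sequential(urls: string[]): Promise<string[][]> {
  const results: string[][] = [];
  for (const url of urls) {
    results.push(await fetchModels(url));
  }
  return results;
}

// Fix: create all the promises up front (no await in the map callback),
// then await Promise.all() once so the requests run concurrently.
async function concurrent(urls: string[]): Promise<string[][]> {
  const promises = urls.map((url) => fetchModels(url));
  return Promise.all(promises);
}
```

With the concurrent version the total wait is roughly the slowest single request rather than the sum of all of them; note that one rejected promise still rejects the whole `Promise.all()` batch, which `Promise.allSettled()` avoids.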
@tjbck commented on GitHub (May 17, 2024):
@atiehamidi good catch, updated on dev branch.
@tjbck commented on GitHub (May 17, 2024):
Closing in favour of #2337