Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-13 10:36:24 -05:00)
Lazy loading connections to prevent blocking the UI #634
Originally created by @bnthn on GitHub (Apr 11, 2024).
Is your feature request related to a problem? Please describe.
From my understanding, the UI is built only after the available models have been pulled from OpenAI and Ollama. Because of this, when Ollama is not reachable via its URL (e.g. because it is not running), the UI is built only after the connection has timed out or been refused several times. In the examples below you can see that I have to wait a whopping 5 minutes in the worst-case scenario before the UI even shows up or I can use OpenAI's GPT models (which should be available anyway).
Feel free to let me know if this is just a problem on my end due to misconfiguration, or if an issue like this already exists (I couldn't find one).
Ollama server is powered off -> the connection is refused a couple of times, and the UI loads after ~5-10 seconds.
Ollama server is powered on, but the Ollama service is not running -> the connection times out a couple of times, and the UI loads after ~4-5 minutes.
Describe the solution you'd like
Lazy / asynchronous loading of the models inside the UI after login. OpenAI's models load quickly and the Ollama models time out (if unavailable) while I can work in the meantime.
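A minimal sketch of the requested behavior, assuming nothing about Open WebUI's actual internals: each provider's model list is fetched independently, and each request is raced against a short timeout so that one unreachable backend cannot block the others. The provider functions and timeout value here are hypothetical placeholders, not the real API.

```typescript
// Wrap a promise so it rejects after `ms` milliseconds instead of hanging.
function withTimeout<T>(p: Promise<T>, ms: number): Promise<T> {
  return Promise.race([
    p,
    new Promise<T>((_, reject) =>
      setTimeout(() => reject(new Error(`timed out after ${ms} ms`)), ms)
    ),
  ]);
}

// Simulated providers (assumptions for illustration only):
// OpenAI answers quickly; Ollama never answers (host powered off).
const fetchOpenAIModels = (): Promise<string[]> =>
  new Promise((resolve) =>
    setTimeout(() => resolve(["gpt-4", "gpt-3.5-turbo"]), 50)
  );
const fetchOllamaModels = (): Promise<string[]> =>
  new Promise(() => {}); // never settles, like a dead host

async function loadModels(): Promise<string[]> {
  // allSettled: a failed or timed-out provider contributes an empty
  // list instead of rejecting the whole load and blocking the UI.
  const results = await Promise.allSettled([
    withTimeout(fetchOpenAIModels(), 2000),
    withTimeout(fetchOllamaModels(), 2000),
  ]);
  return results.flatMap((r) => (r.status === "fulfilled" ? r.value : []));
}

loadModels().then((models) => console.log(models));
```

With this pattern the OpenAI models are usable after ~50 ms, and the dead Ollama backend only costs its own 2-second timeout rather than stalling the whole page.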
Describe alternatives you've considered
Most of the time, either my tower PC (the Ollama host with the thicc GPU) is powered on, so my little homelab server (the Open WebUI host) has access to Ollama (still a couple of seconds of wait time), or the tower PC is powered off, so the UI is usable after about 4-5 seconds (see above). So the 4-5 minute wait scenario happens rarely, if ever; it was just a coincidental find. Still not the cleanest possible solution.
That said, this is one of the most amazing projects I have seen in the last couple of months, let alone in the field of generative AI for self-hosting. Thanks for the consideration!
@bnthn commented on GitHub (Apr 11, 2024):
#1461
@tjbck commented on GitHub (Apr 12, 2024):
Closing in favour of #1288