Lazy loading connections to prevent blocking the UI #634

Closed
opened 2025-11-11 14:27:54 -06:00 by GiteaMirror · 2 comments
Owner

Originally created by @bnthn on GitHub (Apr 11, 2024).

From my understanding, the UI is built only after the available models have been pulled from OpenAI and Ollama. Because of this, when Ollama is not reachable at the configured URL (e.g. because it is not running), the UI is built only after the connection has timed out or been refused several times. In the examples below you can see that in the worst case I have to wait a whopping 5 minutes before the UI even shows up and I can use OpenAI's GPT models (which should be available regardless).

Feel free to let me know if this is just a misconfiguration on my end, or if an issue like this already exists (I couldn't find one).

**Ollama server is powered off -> the connection is refused a couple of times, UI loads after ~5-10 seconds.**

```
ERROR:apps.ollama.main:Connection error: Cannot connect to host <tower_ip>:11434 ssl:default [No route to host]
ERROR:apps.ollama.main:Connection error: Cannot connect to host <tower_ip>:11434 ssl:default [No route to host]
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host <tower_ip>:11434 ssl:default [No route to host]
ERROR:apps.ollama.main:Connection error: Cannot connect to host <tower_ip>:11434 ssl:default [No route to host]
```

**Ollama server is powered on, but the Ollama service is not running -> the connection times out a couple of times, UI loads after ~4-5 minutes.**

```
ERROR:apps.ollama.main:Connection error: Cannot connect to host <tower_ip>:11434 ssl:default [Connection timed out]
ERROR:apps.ollama.main:Connection error: Cannot connect to host <tower_ip>:11434 ssl:default [Connection timed out]
ERROR:apps.ollama.main:Connection error: Cannot connect to host <tower_ip>:11434 ssl:default [Connection timed out]
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host <tower_ip>:11434 ssl:default [Connection timed out]
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host <tower_ip>:11434 ssl:default [Connection timed out]
ERROR:apps.ollama.main:Connection error: Cannot connect to host <tower_ip>:11434 ssl:default [Connection timed out]
```
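The multi-minute wait in the second scenario happens because an unanswered TCP connect falls back to the OS-level timeout, which can be minutes long. A minimal sketch (not Open WebUI's actual code; `fetch_ollama_models` and the `fetch` callback are hypothetical names) of bounding the fetch with an explicit application-level timeout so it fails fast instead:

```python
import asyncio

async def fetch_ollama_models(fetch, timeout_s=5.0):
    """Bound a model fetch so an unreachable Ollama host fails fast
    instead of waiting out the OS TCP connect timeout."""
    try:
        return await asyncio.wait_for(fetch(), timeout=timeout_s)
    except (asyncio.TimeoutError, OSError):
        # Degrade gracefully: the UI can still render without Ollama models.
        return []

async def demo():
    async def slow_fetch():
        await asyncio.sleep(60)  # simulates a host that never answers
        return ["llama3"]
    return await fetch_ollama_models(slow_fetch, timeout_s=0.1)

print(asyncio.run(demo()))  # -> []
```

With a bound like this, the worst case is the chosen timeout (seconds) rather than the OS default (minutes).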

Describe the solution you'd like

**Lazy / asynchronous loading** of the models in the UI after login: OpenAI's models load quickly, the Ollama models time out (if unavailable), and I can keep working in the meantime.
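The idea above can be sketched by fetching each provider concurrently and tolerating per-provider failures, so one slow backend never blocks the others. This is a minimal illustration, not Open WebUI's implementation; the provider coroutines and timeout value are made up for the example:

```python
import asyncio

async def get_openai_models():
    await asyncio.sleep(0.01)  # fast, healthy provider
    return ["gpt-4"]

async def get_ollama_models():
    await asyncio.sleep(10)  # unreachable host: would block serial loading
    return ["llama3"]

async def get_all_models(timeout_s=0.5):
    # Fetch both providers concurrently, each with its own timeout.
    tasks = [
        asyncio.wait_for(get_openai_models(), timeout_s),
        asyncio.wait_for(get_ollama_models(), timeout_s),
    ]
    results = await asyncio.gather(*tasks, return_exceptions=True)
    models = []
    for r in results:
        if isinstance(r, Exception):
            continue  # skip providers that failed or timed out
        models.extend(r)
    return models

print(asyncio.run(get_all_models()))  # -> ['gpt-4']
```

The total wall-clock time is bounded by the single timeout rather than the sum of all providers' failures, which is the behavior the feature request asks for.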

Describe alternatives you've considered

Most of the time, either my tower PC (the Ollama host with the beefy GPU) is powered on, so my little homelab server (the Open WebUI host) has access to Ollama (still a couple of seconds of wait time), or the tower PC is powered off, so the UI is usable after about 4-5 seconds (see above). The 4-5 minute wait scenario therefore happens rarely, if ever; it was just a coincidental find. Still not the cleanest possible situation.

That said, this is one of the most amazing projects I have seen in the last couple of months, let alone in the field of self-hosted generative AI. Thanks for considering!

Author
Owner

@bnthn commented on GitHub (Apr 11, 2024):

#1461

Author
Owner

@tjbck commented on GitHub (Apr 12, 2024):

Closing in favour of #1288


Reference: github-starred/open-webui#634