mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-06 19:08:59 -05:00
issue: offline ollama host prevents open-webui from loading entirely #4494
Originally created by @taylorwilsdon on GitHub (Mar 19, 2025).
Check Existing Issues
Installation Method
Docker
Open WebUI Version
0.5.20
Ollama Version (if applicable)
all
Operating System
all
Browser (if applicable)
all
Confirmation
Expected Behavior
If an Ollama host configured in the Connections panel becomes unresponsive, the models it serves should become unavailable and disappear from the selection list, and the connection should be marked as offline, but Open WebUI should remain responsive and usable. This is how OpenAI API connections already behave; with Ollama connections, however, the UI refuses to render at all if the connection cannot be established.

Actual Behavior
When an enabled Ollama host becomes unreachable (even just because it fell asleep), Open WebUI takes minutes to load and sometimes never does, because the get_all_models timeout is longer than the HTTP timeout.

Video confirming the behavior (always reproducible):
https://github.com/user-attachments/assets/8616bfe8-89ec-4e70-ad8c-8c79ffb2e45f
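The hang described above comes from an unbounded wait on a dead host. Below is a minimal standard-library sketch of the fail-fast alternative: bound each connection attempt with a timeout so one offline host drops out of the list instead of stalling everything. (Open WebUI's real code path uses aiohttp and get_all_models; the probe_host and reachable_hosts helpers here are hypothetical illustrations of the pattern, not the project's code.)

```python
import asyncio


async def probe_host(host: str, port: int, timeout_s: float = 2.0) -> bool:
    """Return True if the host accepts a TCP connection within timeout_s."""
    try:
        reader, writer = await asyncio.wait_for(
            asyncio.open_connection(host, port), timeout=timeout_s
        )
        writer.close()
        await writer.wait_closed()
        return True
    except (OSError, asyncio.TimeoutError):
        # Unreachable or slow host: report it offline instead of blocking.
        return False


async def reachable_hosts(
    hosts: list[tuple[str, int]],
) -> list[tuple[str, int]]:
    """Probe all configured hosts concurrently; offline ones simply drop out."""
    results = await asyncio.gather(*(probe_host(h, p) for h, p in hosts))
    return [hp for hp, ok in zip(hosts, results) if ok]
```

Because the probes run concurrently and each is individually bounded, the total wait is roughly one timeout rather than one timeout per dead host.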
@tjbck commented on GitHub (Mar 19, 2025):
AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST is available to change this behaviour. f806ab0bd2 should address this.

@taylorwilsdon commented on GitHub (Mar 19, 2025):
Oh amazing, that was literally the PR I was about to open (setting a default instead of none). You're too fast for me! :ty:
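The fix discussed in the thread, a finite default timeout instead of none, can be sketched as reading the AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST environment variable and falling back to a bounded default when it is unset or invalid. (Only the variable name comes from the thread; the model_list_timeout helper and the 10-second default are assumptions for illustration, not Open WebUI's exact code.)

```python
import os


def model_list_timeout(default_s: float = 10.0) -> float:
    """Return the model-list timeout in seconds.

    Unset, empty, or unparseable values fall back to a finite default
    rather than "wait forever", so a dead host cannot block startup.
    """
    raw = os.environ.get("AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST", "")
    try:
        return float(raw) if raw else default_s
    except ValueError:
        return default_s
```

For example, running with AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST=5 yields a 5-second timeout, while leaving it unset yields the bounded default instead of an infinite wait.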