Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 03:18:23 -05:00)
[GH-ISSUE #10622] Open WebUI won't start if remote Ollama server is offline #15963
Originally created by @sendmebits on GitHub (Feb 23, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/10622
Bug Report
After upgrading, trying to log in to Open WebUI just sits at a blank screen with the 'OI' logo. The cause was that it was pointed at a remote Ollama server that was offline.
Installation Method
Debian 12 - Python installation, with local AND remote Ollama servers. Also using an OpenAI connection, if that matters.
Environment
Open WebUI Version: v0.5.16
Operating System: Debian 12
Confirmation:
Expected Behavior:
The browser should be able to open the Open WebUI web console even if a remote Ollama server is offline.
Actual Behavior:
You cannot even reach the login screen of Open WebUI (in my case, after an upgrade) when the remote Ollama server is offline.
I am not using Docker; I'm using the Python install.
@sendmebits commented on GitHub (Feb 23, 2025):
Here is the browser log (Safari):
[Log] TypeError: Load failed (index.BqVLHKQO.js, line 1)
[Error] Unhandled Promise Rejection: TypeError: Load failed
(anonymous function) (2.Dq7oyotT.js:2827:84609)
[Error] Unhandled Promise Rejection: TypeError: undefined is not a function (near '...G of k...')
(anonymous function) (2.Dq7oyotT.js:2808:729544)
(anonymous function) (2.Dq7oyotT.js:2808:729619)
(anonymous function) (2.Dq7oyotT.js:2808:729620)
(anonymous function) (2.Dq7oyotT.js:2808:729687)
Module Code (2.Dq7oyotT.js:2808:746127)
[Log] Backend config: – {status: true, name: "Open WebUI", version: "0.5.16", …} (0.DQIPNg5c.js, line 2)
{status: true, name: "Open WebUI", version: "0.5.16", default_locale: "", oauth: {providers: {}}, …}Object
[Log] connected – "T-dYeiDLDqWuIihNAAAV" (0.DQIPNg5c.js, line 1)
[Log] user-list – {user_ids: ["5b98c725-1492-4421-9cdd-752773d6148c"]} (0.DQIPNg5c.js, line 1)
[Log] usage – {models: []} (0.DQIPNg5c.js, line 1)
[Log] user-list – {user_ids: ["5b98c725-1492-4421-9cdd-752773d6148c"]} (0.DQIPNg5c.js, line 1)
[Error] Failed to load resource: the server responded with a status of 504 () (models, line 0)
@tjbck commented on GitHub (Feb 23, 2025):
Intended behaviour here; you can set
AIOHTTP_CLIENT_TIMEOUT_OPENAI_MODEL_LIST to meet your needs.
@daniel-iliesh commented on GitHub (Mar 28, 2025):
Hi @tjbck, I just ran into the same error here.
Wouldn't it be better if the intended behaviour were to start the Open WebUI client and show an error that one of the servers is offline?
Why won't the Open WebUI app load if only one of the Ollama servers I set up is inaccessible?
Seems like a bug to me.
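For readers hitting the same hang: the maintainer's suggested workaround is the environment variable named in the comment above. A minimal sketch of applying it to a Python (pip) install is below; the variable name comes from the thread, but the 5-second value is an arbitrary example, and whether a companion `AIOHTTP_CLIENT_TIMEOUT` variable applies to your version should be checked against the Open WebUI docs for your release.

```shell
# Hedged sketch: shorten the model-list fetch timeout (in seconds) so an
# unreachable remote Ollama/OpenAI endpoint fails fast instead of leaving
# the UI stuck on the 'OI' splash screen.
# 5 is an example value, not a recommendation.
export AIOHTTP_CLIENT_TIMEOUT_OPENAI_MODEL_LIST=5

# Start Open WebUI as usual for a pip install.
open-webui serve
```

With a short timeout, the offline server's model-list request gives up quickly and the rest of the UI can load; it does not fix the underlying issue that one dead backend blocks startup.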