[GH-ISSUE #23663] issue: Chat window freezes indefinitely when LLM returns ContentPolicyViolationError (status: 400) #20035
Originally created by @tasawwuramd on GitHub (Apr 13, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/23663
Check Existing Issues
Installation Method
Git Clone
Open WebUI Version
0.8.12
Ollama Version (if applicable)
No response
Operating System
Linux
Browser (if applicable)
Chrome 146.0, Edge 146.0
Confirmation
I have read and followed all instructions in README.md.
Expected Behavior
The chat window exits the loading state and displays an error message to the user (e.g., "The response was filtered due to the content management policy").
Actual Behavior
The chat window freezes. The loading spinner spins indefinitely.
taskIds is never cleared. The user must reload the page to recover.
Steps to Reproduce
The error is reproducible with any upstream that returns HTTP 4xx with a JSON body (Azure content filter, guardrails, quota exceeded, etc.). Any non-2xx, non-streaming response from the LLM backend will trigger this freeze.
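For illustration, here is a minimal mock upstream that reproduces the condition. It is a hypothetical helper, not part of the original report: a tiny OpenAI-compatible endpoint that always answers HTTP 400 with a JSON error body, mimicking an Azure content-filter rejection. The port, path, and error payload shape are assumptions; point an OpenAI-type connection in Open WebUI at this server and send any chat message.

# mock_upstream.py -- hypothetical reproduction helper (not from the report).
# Always returns HTTP 400 with a JSON error body, like an Azure content
# filter. Requires: pip install fastapi uvicorn
from fastapi import FastAPI
from fastapi.responses import JSONResponse
import uvicorn

app = FastAPI()

@app.post("/v1/chat/completions")
async def chat_completions():
    # Non-2xx, non-streaming JSON response: per the report, any upstream
    # answer of this shape leaves the chat window spinning indefinitely.
    return JSONResponse(
        status_code=400,
        content={
            "error": {
                "code": "content_filter",
                "message": "The response was filtered due to the content management policy.",
            }
        },
    )

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=9000)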
Logs & Screenshots
Logs:
open-webui[607740]: └ ValueError(<ERROR_MESSAGES.EMPTY_CONTENT: 'The content provided is empty. Please ensure that there is text or data present be...
open-webui[607740]: raise ValueError(ERROR_MESSAGES.EMPTY_CONTENT)
Additional Information
The bug lives in two places in the backend pipeline:
1. generate_chat_completion in backend/open_webui/routers/openai.py
When the upstream LLM returns HTTP 400, the function parses the JSON body and returns a JSONResponse(status_code=400, ...) instead of raising an exception (a sketch of this pattern follows the maintainer comment below).
@tjbck commented on GitHub (Apr 13, 2026):
Should be addressed in dev.
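For reference, a minimal sketch of the pattern described in item 1, assuming a FastAPI/aiohttp proxy like Open WebUI's. The function name and module mirror the issue text, but the body is simplified and illustrative, not the actual source:

# Simplified sketch of the reported pattern; illustrative only, not the
# actual body of backend/open_webui/routers/openai.py.
import aiohttp
from fastapi import HTTPException  # used in the suggested fix below
from fastapi.responses import JSONResponse

async def generate_chat_completion(url: str, payload: dict):
    async with aiohttp.ClientSession() as session:
        async with session.post(f"{url}/chat/completions", json=payload) as r:
            data = await r.json()
            if r.status >= 400:
                # Reported behavior: the 4xx body is wrapped in a plain
                # JSONResponse, so callers receive a "response" object
                # rather than an exception, the error path that clears
                # taskIds never runs, and the chat client keeps spinning.
                return JSONResponse(status_code=r.status, content=data)
                # A fix along the lines the report implies would raise
                # instead, so the error is propagated and surfaced:
                # raise HTTPException(status_code=r.status, detail=data)
            return data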