Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-05 18:38:17 -05:00)
[GH-ISSUE #23176] Placeholder LLM message should be created with a pending/incomplete status to handle interrupted generation gracefully #19909
Originally created by @ShirasawaSama on GitHub (Mar 28, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/23176
Check Existing Issues
Installation Method
Git Clone
Open WebUI Version
v0.8.12
Ollama Version (if applicable)
No response
Operating System
Mac
Browser (if applicable)
No response
Confirmation
README.md
Expected Behavior
The placeholder message should be created with a pending/incomplete status (e.g., `status: "pending"` or `done: false`). This way, an interrupted or failed generation can be detected and surfaced to the user when the conversation is reloaded.
Actual Behavior
The placeholder assistant message is saved to the database without any pending/incomplete status marker. When the /completions request fails or never reaches the backend (due to page refresh, network drop, or conversation switch), the backend is completely unaware of the failure.
Upon revisiting the conversation, the empty placeholder message is rendered as if the assistant intentionally returned an empty response — no error state, no loading indicator, no retry option, and no way for the user to understand what went wrong or recover from it.
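The pending-status idea described above could be sketched roughly as follows. This is a hypothetical illustration, not Open WebUI's actual code; the `ChatMessage` shape, `done` flag, and helper names are assumptions for the sake of the example.

```typescript
// Hypothetical message shape with an explicit pending flag.
interface ChatMessage {
  id: string;
  role: "user" | "assistant";
  content: string;
  done: boolean; // false while generation is still pending/incomplete
}

// Create the placeholder marked as not-done, instead of saving it as a
// finished (but empty) assistant message.
function createPlaceholder(id: string): ChatMessage {
  return { id, role: "assistant", content: "", done: false };
}

// On revisiting the chat, an assistant message that is still not done
// and still empty means the generation was interrupted; the UI could
// then render an error state and a retry button instead of an empty bubble.
function wasInterrupted(msg: ChatMessage): boolean {
  return msg.role === "assistant" && !msg.done && msg.content === "";
}
```

With a flag like this persisted alongside the placeholder, the backend and frontend both have enough information to distinguish "intentionally empty response" from "generation never completed".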
Steps to Reproduce
Alternative reproduction:
Block the `/completions` request (via DevTools) after the placeholder message is created but before the LLM response streams back, then refresh the page.
Logs & Screenshots
Additional Information
This issue becomes even more problematic when starting a new conversation by sending the first message. In this scenario:
1. The frontend calls `POST /api/v1/chats/new` to create a new chat — this succeeds.
2. It then calls `POST /api/v1/chats/:id` to create the placeholder assistant message, but the user refreshes the page before this call completes.
3. The `/completions` request is also never sent.

The result is a conversation containing only the user's message with no assistant response whatsoever — not even an empty placeholder. The user sees a dead-end conversation with just their own message, no loading state, no error, and no indication that the assistant was supposed to respond. There is no retry button or any way to recover other than manually resending the message or deleting the conversation entirely.
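The race described above can be modeled as a short sequence of steps, where a refresh cuts the sequence off partway. This is a simplified, hypothetical sketch (the step names and the `chatStateAfterRefresh` helper are illustrative; only the endpoint order comes from the report).

```typescript
// State of the new conversation depending on how many of the three
// steps completed before the user refreshed the page:
//   step 1: POST /api/v1/chats/new       (create the chat)
//   step 2: POST /api/v1/chats/:id       (save the placeholder message)
//   step 3: request to /completions      (start generation)
function chatStateAfterRefresh(completedSteps: number): {
  chatExists: boolean;
  hasPlaceholder: boolean;
  completionSent: boolean;
} {
  return {
    chatExists: completedSteps >= 1,
    hasPlaceholder: completedSteps >= 2,
    completionSent: completedSteps >= 3,
  };
}
```

A refresh after step 1 yields exactly the dead-end state the report describes: the chat exists with the user's message, but there is no placeholder and no completion request, so nothing on reload indicates an assistant response was ever expected.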
@tjbck commented on GitHub (Apr 14, 2026):
Addressed in dev.