[GH-ISSUE #22565] issue: Message queue should not block new messages while background tasks are still running #35279

Closed
opened 2026-04-25 09:30:26 -05:00 by GiteaMirror · 1 comment

Originally created by @ShirasawaSama on GitHub (Mar 11, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/22565

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Git Clone

Open WebUI Version

v0.8.10

Ollama Version (if applicable)

No response

Operating System

macOS

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

When the LLM finishes generating its main response, the user should be able to immediately send a new message, even if background tasks (such as follow-up suggestion generation and title generation) are still in progress. These tasks are auxiliary and should not block the main conversation flow. The new message should be sent directly to the LLM without delay.

Actual Behavior

When the message queue feature is enabled, after the LLM completes its main response, if background tasks (e.g., follow-up suggestions, title generation) are still running, sending a new message causes it to be enqueued into the message queue instead of being sent directly. This blocks the user from continuing the conversation until all background tasks finish.
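The distinction being reported can be sketched as a small gating predicate. This is an illustrative model only, with hypothetical names (`ChatState`, `shouldEnqueue*`); it is not Open WebUI's actual implementation:

```typescript
// Hypothetical model of the queue-gating logic described above.
// All names are illustrative, not taken from the Open WebUI codebase.
type ChatState = {
  streaming: boolean;      // main LLM response still being generated
  backgroundTasks: number; // pending auxiliary tasks (follow-ups, title generation, ...)
};

// Reported behavior: a new message is enqueued while *any* work is pending,
// including auxiliary background tasks.
function shouldEnqueueReported(s: ChatState): boolean {
  return s.streaming || s.backgroundTasks > 0;
}

// Expected behavior: only the main response should block direct sending;
// auxiliary tasks run independently of the conversation flow.
function shouldEnqueueExpected(s: ChatState): boolean {
  return s.streaming;
}

// State after streaming finishes but while background tasks are still running:
const afterResponse: ChatState = { streaming: false, backgroundTasks: 2 };
console.log(shouldEnqueueReported(afterResponse)); // true  -> message is queued (the bug)
console.log(shouldEnqueueExpected(afterResponse)); // false -> message is sent directly
```

In this model the fix amounts to excluding the background-task count from the enqueue condition.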

Steps to Reproduce

  1. Enable the message queue feature in Open WebUI settings.
  2. Enable background tasks such as follow-up question suggestions and/or automatic title generation.
  3. Start a new chat and send a message to the LLM.
  4. Wait for the LLM to fully complete its main response (streaming finished).
  5. While the follow-up suggestions or title generation tasks are still visibly loading/processing, immediately try to send a new message.
  6. Observe: The new message enters the message queue and is not sent until the background tasks complete, instead of being sent directly.

Logs & Screenshots

Screenshot: https://github.com/user-attachments/assets/8bbca1b9-0a26-4f38-b9f7-3aa7c89cc0c5

Additional Information

No response

GiteaMirror added the bug label 2026-04-25 09:30:26 -05:00

@tjbck commented on GitHub (Mar 15, 2026):

Addressed in dev.

Reference: github-starred/open-webui#35279