[GH-ISSUE #23181] issue: Switching conversations while message queue is processing causes messages to be sent in bulk and race condition issues #35440

Closed
opened 2026-04-25 09:39:03 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @ShirasawaSama on GitHub (Mar 28, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/23181

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Git Clone

Open WebUI Version

v0.8.12

Ollama Version (if applicable)

No response

Operating System

Mac

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

  • Switching away from a conversation should preserve the message queue state as-is. The queue should be paused or continue processing in the background tied to the original conversation.
  • Switching back to the original conversation should show the queue still intact with remaining messages, and resume sequential processing from where it left off.
  • "Send Now" should only send the current pending message and resume normal sequential processing — never triggering a bulk send of all remaining messages.
  • The queue state should be per-conversation and fully isolated from navigation, so switching conversations never corrupts or disrupts the queue.
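The per-conversation isolation requested above could be sketched roughly as follows. This is an illustrative design, not Open WebUI's actual code; the `ConversationQueues` class and its method names are assumptions for the sake of the example. Keying queue state by conversation ID means navigation can never touch another conversation's pending messages.

```typescript
// Illustrative sketch (not Open WebUI code): queue state keyed by
// conversation ID, so switching conversations cannot corrupt a queue.
type Message = { id: number; text: string };

class ConversationQueues {
  private queues = new Map<string, Message[]>();

  enqueue(convId: string, msg: Message): void {
    const q = this.queues.get(convId) ?? [];
    q.push(msg);
    this.queues.set(convId, q);
  }

  // Take exactly one pending message. Callers should await the send
  // before dequeuing again, keeping processing strictly sequential.
  dequeue(convId: string): Message | undefined {
    return this.queues.get(convId)?.shift();
  }

  pending(convId: string): number {
    return this.queues.get(convId)?.length ?? 0;
  }
}
```

With this shape, dequeuing from conversation A leaves conversation B's queue untouched, which is the isolation property the expected behavior describes.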

Actual Behavior

When the user switches away from the conversation and switches back, the queue processing loop loses its proper state. Instead of resuming sequential processing, it dumps all remaining messages at once. The queue state is effectively corrupted by the conversation switch, causing race conditions between the old and new execution contexts.
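The race between the old and new execution contexts can be illustrated with a small sketch. This is a hypothetical model of the hazard, not the actual implementation: if a conversation switch starts a second processing loop while the first is still awaiting a send, both loops drain the same queue. A generation counter ("epoch") is one common way to let the superseded loop bail out quietly.

```typescript
// Hypothetical sketch of the stale-loop hazard and one possible guard.
// An async loop started before a conversation switch keeps running unless
// it can detect that it has been superseded; an epoch counter lets the
// old loop stop instead of racing the new one.
let epoch = 0;

async function processQueue(queue: string[], sent: string[]): Promise<void> {
  const myEpoch = ++epoch;          // claim the queue for this loop
  while (queue.length > 0) {
    if (myEpoch !== epoch) return;  // a newer loop took over: stop quietly
    const msg = queue.shift()!;
    sent.push(msg);                 // stand-in for the real send
    await Promise.resolve();        // yield, as a real network call would
  }
}
```

Without the epoch check, two concurrent `processQueue` calls would interleave and drain the queue faster than intended, which matches the observed "all remaining messages dumped at once" behavior.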

Steps to Reproduce

Issue 1: Switching conversations causes bulk send

  1. Add multiple messages to the message queue (e.g., 5+ messages).
  2. Start processing the queue — wait for a few messages to be sent sequentially.
  3. While the queue is still processing, switch to a different conversation in the sidebar.
  4. Switch back to the original conversation.
  5. Observe that all remaining queued messages have been fired off at once instead of continuing to be sent one by one.

Issue 2: "Send Now" triggers remaining messages

  1. Add multiple messages to the message queue.
  2. Start the queue processing.
  3. Click the "Send Now" button to immediately send the current pending message.
  4. Observe that occasionally all remaining queued messages are also sent simultaneously.
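The "Send Now" symptom above suggests the immediate-send path overlaps the background loop's own send. A minimal sketch of one possible guard, assuming a simple busy flag (the `sendNow` function and its signature are illustrative, not Open WebUI's actual API):

```typescript
// Illustrative sketch: guard "Send Now" with a busy flag so it sends only
// the head of the queue and never overlaps an in-flight send.
let sending = false;

async function sendNow(queue: string[], sent: string[]): Promise<boolean> {
  if (sending || queue.length === 0) return false; // another send in flight
  sending = true;
  try {
    sent.push(queue.shift()!);     // send exactly one message
    await Promise.resolve();       // stand-in for the network round trip
    return true;
  } finally {
    sending = false;
  }
}
```

A second `sendNow` invoked while the first is still in flight returns `false` instead of draining further messages, which is the "only send the current pending message" behavior the report asks for.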

Logs & Screenshots

N/A

Additional Information

#23180

GiteaMirror added the bug label 2026-04-25 09:39:04 -05:00

@TomTheWise commented on GitHub (Apr 3, 2026):

I don't know if this is exactly the same behavior, something similar, or a completely different thing:

When sending a prompt with code execution (Pyodide-based) activated and then switching browser tabs (switching conversation) before the thinking process and code execution/interpretation is finished, it never finishes; OWUI gets stuck in the spinning/queuing state, and the code is never run inside Pyodide.
So it's currently absolutely necessary to stay in the tab and in the conversation and wait for the AI to completely finish thinking, executing code, and responding.

I'm unsure whether this is another form of the issue you described (switching away from the conversation), a completely separate thing, or maybe a bug/general limitation of the Pyodide implementation?


@Classic298 commented on GitHub (Apr 3, 2026):

@ShirasawaSama could you confirm this is fixed in dev?


@TomTheWise commented on GitHub (Apr 3, 2026):

Just pulled the latest dev and the issue I described is the same. Do you think it's the exact same thing as this issue, or unrelated (maybe only occurring with code execution)?
If it is highly likely to be unrelated, I can create a new issue and post a video.

Just switching conversations (to a new chat or another older conversation within the same OWUI tab, without manually refreshing) while a response with code execution is being generated breaks it too. Always reproducible, so leaving the browser tab is not necessary.
I guess this fits this issue, as it matches the expected and actual behavior described?


@TomTheWise commented on GitHub (Apr 3, 2026):

https://github.com/user-attachments/assets/cc8d5ffc-eb71-4ba9-b877-9aab6112be7e
Had to shorten the video quite a lot and reduce quality to get it to upload here.

Reference: github-starred/open-webui#35440