Knowledge Base query block streaming output of other users #3296

Closed
opened 2025-11-11 15:28:23 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @danielaskdd on GitHub (Jan 16, 2025).

Bug Report

Installation Method

Docker

Environment

  • Open WebUI Version: 0.5.4
  • Operating System: Ubuntu 24.04.1

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [ ] I have included the Docker container logs.
  • [ ] I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

One user's operation does not block another user's operation for an extended period.

Actual Behavior:

When one user sends a query to the Knowledge Base, it immediately blocks the streaming output of all other users. The other users' streams do not resume until the Knowledge Base search completes and the query is sent to the LLM.

Reproduction Details

  1. User A is receiving a streamed response from an LLM query.
  2. User B sends a query to the Knowledge Base.
  3. User A's browser output freezes immediately.
  4. User B's Knowledge Base search completes, and User B waits for the response from the LLM.
  5. User A's output resumes.
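
The behavior above is consistent with a synchronous retrieval call (e.g. embedding plus vector search) running directly on the server's async event loop. As a rough illustration only, not Open WebUI's actual code, the sketch below (with hypothetical names such as `blocking_kb_search`) shows how one blocking call inside a coroutine freezes a concurrent stream, and how offloading it with `asyncio.to_thread` keeps the stream flowing:

```python
import asyncio
import time

# Hypothetical stand-in for a synchronous Knowledge Base search
# (embedding + vector lookup); illustrative only.
def blocking_kb_search() -> None:
    time.sleep(1.0)  # simulate a search that takes ~1 s

async def stream_tokens(events: list) -> None:
    # User A's stream: one token roughly every 0.1 s.
    for _ in range(5):
        await asyncio.sleep(0.1)
        events.append(time.monotonic())

async def kb_query(offload: bool) -> None:
    # User B submits a Knowledge Base query while A is mid-stream.
    await asyncio.sleep(0.25)
    if offload:
        # Fix pattern: run the sync search in a worker thread so the
        # event loop keeps serving other users' coroutines.
        await asyncio.to_thread(blocking_kb_search)
    else:
        # Bug pattern: the sync call stalls the whole event loop,
        # freezing every other user's streaming output.
        blocking_kb_search()

async def _run(offload: bool) -> float:
    events: list = []
    await asyncio.gather(stream_tokens(events), kb_query(offload))
    # Worst gap between consecutive tokens seen by User A.
    return max(b - a for a, b in zip(events, events[1:]))

def demo(offload: bool) -> float:
    return asyncio.run(_run(offload))

if __name__ == "__main__":
    print(f"sync search in handler: worst token gap {demo(False):.2f}s")
    print(f"asyncio.to_thread:      worst token gap {demo(True):.2f}s")
```

With the blocking call, User A's worst inter-token gap is roughly the full search duration; with the search offloaded to a thread, tokens keep arriving at the normal rate. If this matches the real cause, the fix would be to move the Knowledge Base search off the event loop (thread or process pool) rather than calling it synchronously from the request handler.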

Reference: github-starred/open-webui#3296