[GH-ISSUE #22970] issue: No loading indicator between tool call completion and LLM response, making it appear as if the response has stopped #35385

Closed
opened 2026-04-25 09:35:56 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @ShirasawaSama on GitHub (Mar 24, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/22970

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Git Clone

Open WebUI Version

v0.8.10

Ollama Version (if applicable)

No response

Operating System

MacOS 26

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

After tool calls complete and before the LLM starts streaming, there should be a visible loading/processing indicator (e.g., a spinner, a pulsing dots animation, or a "Generating response..." status message) to clearly communicate to the user that the system is still working and waiting for the LLM response.

Actual Behavior

Once the tool call finishes, the UI renders the tool results (e.g., "Retrieved 9 sources" with citation icons) but then shows nothing — no spinner, no animation, no status text. The input box appears ready for new input. The user has no way to tell whether the system is still processing or has silently failed/stopped. This leads to confusion, especially when the wait time is long (15+ seconds).

Steps to Reproduce

  1. Enable web search (or any other tool).
  2. Use a model with relatively high latency for the first token (e.g., gemini-3.1-pro-preview).
  3. Send a query that triggers a web search (e.g., "today news").
  4. Observe: the web search completes, citations/sources are displayed — but then the UI goes completely idle with no indication that the LLM is still being called.

Logs & Screenshots

![Screenshot: UI after the web search completes, with sources shown but no loading indicator](https://github.com/user-attachments/assets/31258edf-e0fd-400d-9acf-051d3479759f)

There are no status updates here for 15 seconds.

Additional Information

No response

GiteaMirror added the bug label 2026-04-25 09:35:56 -05:00
Author
Owner

@ctcanbol commented on GitHub (Mar 24, 2026):

I'm also having problems because of this behavior; users sometimes assume the conversation is stuck. I tried to work around it with a filter, but it still doesn't cover all the scenarios that cause this.

<!-- gh-comment-id:4117238648 -->
Author
Owner

@tjbck commented on GitHub (Apr 1, 2026):

Intended behaviour, open to discussion.

<!-- gh-comment-id:4169414788 -->

Reference: github-starred/open-webui#35385