[GH-ISSUE #24182] issue: infinite load of big chats #58890

Closed
opened 2026-05-06 00:21:25 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @ghost on GitHub (Apr 27, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/24182

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Other

Open WebUI Version

latest

Ollama Version (if applicable)

latest

Operating System

any

Browser (if applicable)

firefox latest

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

Chats should load when opened.

Actual Behavior

Large chats never finish loading, even though the chunk with the chat data appears in the network tab (2-3 megabytes, for example).

Image: https://github.com/user-attachments/assets/5c5babc0-7bfc-4cce-8dc3-54cf6a62a6d9

Steps to Reproduce

Have a really long chat, for example 20 requests with about 20k tokens in each reply.
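For scale, the reproduction numbers above can be turned into a rough payload estimate. This sketch assumes ~4 characters per token (a common rule of thumb, not stated in the report) and that each reply is stored once in the chat JSON; the result lands near the 2-3 MB chunk seen in the network tab.

```python
# Rough size estimate for the chat payload described in the steps above.
# CHARS_PER_TOKEN is an assumption, not a value from the report.
TOKENS_PER_REPLY = 20_000
REPLIES = 20
CHARS_PER_TOKEN = 4  # rule-of-thumb average for English text

approx_bytes = TOKENS_PER_REPLY * REPLIES * CHARS_PER_TOKEN
print(f"~{approx_bytes / 1_000_000:.1f} MB")  # ~1.6 MB of message text alone
```

JSON escaping, metadata, and markdown markup add overhead on top of the raw text, which is consistent with the reported 2-3 MB response.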

Logs & Screenshots

There are no errors in the console.

Additional Information

No response

GiteaMirror added the bug label 2026-05-06 00:21:25 -05:00
Author
Owner

@ghost commented on GitHub (Apr 27, 2026):

"text": "{\"id\":\"xxxxxxxxxx\",\"user_id\":\"yyyyyyyyyyyyyyyyy\",
\"title\":\"String Parser\",
\"chat\":{\"id\":\"xxxxxxxxxx\",
\"title\":\"String Parser\",
\"models\":[\"qwen-3.6-35b\"],
\"history\":{
\"messages\":{
\"nnnnnnnnnn\":{
\"id\":\"nnnnnnnnnn\",\"parentId\":null,
\"childrenIds\":[\"hhhhhhhhhhhhhhhh\"],
\"role\":\"user\",\"content\":\"i got function\\n  \\n\\n```\\ntext:abcd\\n.

just a part of that response
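The fragment above suggests the chat history is stored as a flat map of messages keyed by id, linked into a tree via `parentId` and `childrenIds`. A minimal sketch of walking such a structure (field names are taken from the fragment; the ids and the traversal itself are assumptions about how the client reconstructs the visible thread):

```python
# Stand-in for the structure visible in the response fragment:
# history.messages is a dict keyed by message id. Ids are placeholders.
history = {
    "messages": {
        "m1": {"id": "m1", "parentId": None, "childrenIds": ["m2"],
               "role": "user", "content": "i got function ..."},
        "m2": {"id": "m2", "parentId": "m1", "childrenIds": [],
               "role": "assistant", "content": "..."},
    }
}

def thread(messages: dict, root_id: str) -> list[str]:
    """Follow the first-child chain from root_id down to a leaf."""
    order, cur = [], root_id
    while cur is not None:
        node = messages[cur]
        order.append(node["id"])
        cur = node["childrenIds"][0] if node["childrenIds"] else None
    return order

print(thread(history["messages"], "m1"))  # ['m1', 'm2']
```

If the client performs a walk like this over thousands of large messages while also rendering markdown for each, that could plausibly stall the UI without producing console errors, matching the symptoms reported here.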

Author
Owner

@ghost commented on GitHub (Apr 27, 2026):

All I did was use the chat, close the browser, and reopen it.


Reference: github-starred/open-webui#58890