High network usage was observed only during page interactions. #2828

Closed
opened 2025-11-11 15:15:15 -06:00 by GiteaMirror · 0 comments

Originally created by @GrayXu on GitHub (Nov 26, 2024).

Bug Report

Environment

  • Open WebUI Version: v0.4.4 (Docker)
  • Browser (if applicable): Chrome 131.0.6778.86

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

When switching between conversations on the page, loading should be quick, since each conversation consists of only a small amount of text (the earlier rendering issue has already been resolved). This should hold even for conversations that use PDF research papers smaller than 4 MB as full-document context.

Actual Behavior:

Switching triggers significant excess network traffic, which blocks the loading interface for a long time on lightweight cloud servers with limited bandwidth (for example, on my server capped at 8 Mbps, loading takes over 20 seconds).
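As a rough sanity check (my own arithmetic, not stated in the report): at 8 Mbps, transferring even the full 4 MB PDF would take about 4 seconds, so a 20+ second load implies on the order of 20 MB crossing the wire.

```python
# Back-of-envelope check with the numbers from the report:
# an 8 Mbps link and an observed load time of over 20 seconds.
link_mbps = 8    # bandwidth cap, megabits per second
pdf_mb = 4       # PDF size, megabytes
load_seconds = 20

# Time to transfer just the PDF: 4 MB -> 32 Mbit at 8 Mbps.
pdf_seconds = pdf_mb * 8 / link_mbps      # 4.0 s

# Data implied by a 20 s load at the full link rate.
implied_mb = load_seconds * link_mbps / 8  # 20.0 MB

print(pdf_seconds, implied_mb)
```

That gap (roughly 5x the PDF size) is what makes the traffic look excessive rather than merely unoptimized.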

Description

Bug Summary:
In theory, only a small amount of conversation text needs to be loaded; the observed traffic is far beyond expectation even accounting for the PDF context, especially since the context shown in the conversation is plain text rather than the original PDF file. Moreover, the file itself could be lazy-loaded, i.e. fetched only when the user actually opens it. Similar excessive network usage also occurs when loading older conversation lists.
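A minimal sketch of the lazy-loading the report suggests: the conversation payload carries only file metadata, and the file body is fetched and cached on first access. The `FileRef` fields and the injected fetch callable are illustrative assumptions, not Open WebUI's actual API.

```python
# Sketch only: lazy, cached retrieval of attached-file content,
# so opening a chat does not transfer file bodies up front.
from dataclasses import dataclass


@dataclass
class FileRef:
    # Hypothetical metadata shipped with the conversation payload.
    file_id: str
    name: str
    size_bytes: int


class LazyFileStore:
    def __init__(self, fetch):
        self._fetch = fetch   # callable: file_id -> bytes (e.g. an HTTP GET)
        self._cache = {}      # file_id -> bytes, filled on demand

    def content(self, ref: FileRef) -> bytes:
        # Hit the network only the first time a file is actually opened.
        if ref.file_id not in self._cache:
            self._cache[ref.file_id] = self._fetch(ref.file_id)
        return self._cache[ref.file_id]
```

With this pattern, switching conversations costs only the metadata transfer; the 4 MB PDF moves over the link once, and only if the user opens it.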


Reference: github-starred/open-webui#2828