[GH-ISSUE #6986] "Fluidly stream large external response chunks" Breaks Token Usage Display in Chat Statistics
Originally created by @fl0w1nd on GitHub (Nov 16, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/6986
Bug Report
Installation Method
Docker
Environment
Open WebUI Version: v0.3.35
Operating System: macOS Sequoia
Browser (if applicable): Chrome/130.0.0.0
Expected Behavior:
When the option "Fluidly stream large external response chunks" is enabled in Settings -> Interface -> UI Configuration, the token usage statistics for the chat should still be displayed normally.
Actual Behavior:
Description
When the option "Fluidly stream large external response chunks" is enabled, the "generation info" in chat messages disappears; it is displayed normally when the option is turned off.
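A minimal sketch of one plausible failure mode, assuming the fluid-streaming option re-chunks the response text into smaller pieces for a smoother typewriter effect. The `fluid_rechunk` function, field names, and stream shape below are hypothetical illustrations, not Open WebUI's actual code:

```python
# Hypothetical sketch (not Open WebUI's real implementation): if a
# "fluid streaming" layer re-chunks responses but forwards only the
# text delta, a trailing usage/stats payload never reaches the UI.

def fluid_rechunk(chunks, size=3):
    """Split each incoming chunk's text into smaller pieces for a
    smoother typewriter effect. Bug illustrated: non-text fields
    such as 'usage' are silently dropped."""
    for chunk in chunks:
        text = chunk.get("content", "")
        for i in range(0, len(text), size):
            yield {"content": text[i:i + size]}
        # Any 'usage' field on the original chunk is discarded here.

# Simulated stream: text chunk followed by a final stats-only chunk.
stream = [
    {"content": "Hello, world!"},
    {"content": "", "usage": {"total_tokens": 42}},  # generation info
]

rechunked = list(fluid_rechunk(stream))
has_usage = any("usage" in c for c in rechunked)
print(has_usage)  # False: the usage info was lost, so no stats display
```

If this is roughly what happens, a fix would be to pass through (or merge onto the last emitted piece) any non-text fields when splitting a chunk.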
Logs and Screenshots
"Fluidly stream large external response chunks" on:
"Fluidly stream large external response chunks" off:
Additional Information
Additionally, I think the "Fluidly stream large external response chunks" option could be further optimized to make the streaming typewriter effect smoother.
@tjbck commented on GitHub (Jan 30, 2025):
Deprecated.