Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 03:18:23 -05:00)
[GH-ISSUE #24294] issue: Analytics token count stays at 0 for streaming responses, but works with non-streaming (LiteLLM + Azure OpenAI | Completions API) #58921
Originally created by @DediCATeD88 on GitHub (May 1, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/24294
Check Existing Issues
Installation Method
Docker
Open WebUI Version
v0.9.2
Ollama Version (if applicable)
No response
Operating System
Debian 12 (Docker host)
Browser (if applicable)
Chrome
Confirmation
I have read and followed all the instructions in README.md.
Expected Behavior
When using streaming responses, token usage should still be persisted and shown correctly in Admin Analytics per user, just like it is when streaming is disabled.
Actual Behavior
With streaming enabled, Admin Analytics shows the correct message count per user, but token count stays at 0.
With streaming disabled, token counting works immediately and correctly.
This happens reproducibly in my setup:
Open WebUI -> Completions API -> LiteLLM -> Azure OpenAI.
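For background: OpenAI-compatible streaming endpoints only report token usage when the client opts in (via `stream_options: {"include_usage": true}`), and the counts then arrive in a single final chunk whose `choices` list is empty, so a consumer that only reads content deltas never sees them. A minimal sketch of that consumption pattern (the simulated chunks and the `accumulate` helper are illustrative, not Open WebUI's actual code):

```python
# Simulated chunks from a chat.completions stream requested with
# stream_options={"include_usage": True}: content chunks carry
# usage = null, and one trailing chunk (empty "choices") carries usage.
chunks = [
    {"choices": [{"delta": {"content": "Hel"}}], "usage": None},
    {"choices": [{"delta": {"content": "lo"}}], "usage": None},
    {"choices": [],
     "usage": {"prompt_tokens": 9, "completion_tokens": 2, "total_tokens": 11}},
]

def accumulate(stream):
    """Collect text deltas and pick up usage from the trailing chunk."""
    text, usage = [], None
    for chunk in stream:
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {}).get("content")
            if delta:
                text.append(delta)
        if chunk.get("usage"):  # only the final chunk has non-null usage
            usage = chunk["usage"]
    return "".join(text), usage

text, usage = accumulate(chunks)
print(text)                   # Hello
print(usage["total_tokens"])  # 11
```

If the trailing usage chunk is dropped (or never requested), `usage` stays `None`, which would show up downstream exactly as a token count of 0 while the message itself is recorded.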
I also tested LiteLLM with always_include_stream_usage: true, but this did not change the behavior.
Steps to Reproduce
v0.9.2
Logs & Screenshots
No frontend error is visible to the user.
Additional Information
No response
@Classic298 commented on GitHub (May 4, 2026):
likely fixed by 989d5fd4e2 and a32d26e61d. Testing wanted on dev.