mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-06 02:48:13 -05:00
[PR #23322] [CLOSED] fix: normalize usage keys on save in ChatMessages.upsert_message #27147
📋 Pull Request Information
Original PR: https://github.com/open-webui/open-webui/pull/23322
Author: @smorello87
Created: 4/1/2026
Status: ❌ Closed
Base: dev ← Head: fix/normalize-usage-on-save

📝 Commits (10+)

- fe6783c Merge pull request #19030 from open-webui/dev
- fc05e0a Merge pull request #19405 from open-webui/dev
- e3faec6 Merge pull request #19416 from open-webui/dev
- 9899293 Merge pull request #19448 from open-webui/dev
- 140605e Merge pull request #19462 from open-webui/dev
- 6f1486f Merge pull request #19466 from open-webui/dev
- d95f533 Merge pull request #19729 from open-webui/dev
- a727153 0.6.43 (#20093)
- 6adde20 Merge pull request #20394 from open-webui/dev
- f9b0534 Merge pull request #20522 from open-webui/dev

📊 Changes
1 file changed (+4 additions, -0 deletions)
- backend/open_webui/models/chat_messages.py (+4 -0)

📄 Description
Pull Request Checklist
- Targets the dev branch
- Reuses the existing `normalize_usage()` from `utils/response.py`
- Changelog entry uses the `fix:` prefix

Changelog Entry
Description
OpenAI-compatible APIs (OpenRouter, etc.) return token usage as `prompt_tokens`/`completion_tokens`, but the analytics queries in `get_token_usage_by_model` and `get_token_usage_by_user` read `input_tokens`/`output_tokens`. The existing `normalize_usage()` function in `utils/response.py` handles this mapping but is only called in the streaming middleware, so the normalized result never reaches the database save path through `ChatMessages.upsert_message()`.

This adds `normalize_usage()` calls in both the update and insert branches of `upsert_message()`, ensuring all usage data is normalized before saving regardless of code path.

Added
Changed
Deprecated
Removed
Fixed
`chat_message.usage` was saved with OpenAI-format keys (`prompt_tokens`/`completion_tokens`) but analytics queries read Anthropic-format keys (`input_tokens`/`output_tokens`). Now both key formats are always present.

Security
Breaking Changes
Additional Information
How it was tested
Inspected the `chat_message` table directly to confirm both `prompt_tokens` and `input_tokens` keys are present in saved usage JSON.

Screenshots or Videos
Before fix — analytics shows 0 tokens (keys missing):
After fix — both key formats present:
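The behavior the PR describes (both key formats always present after save) can be sketched as follows. This is a minimal illustration only: the real `normalize_usage()` in `utils/response.py` may handle additional fields, and the function body here is an assumption based on the description, not the actual implementation.

```python
import json


def normalize_usage(usage: dict) -> dict:
    """Copy OpenAI-style token keys to Anthropic-style keys and vice versa.

    Sketch of the mapping described in the PR; the actual
    normalize_usage() in utils/response.py may differ.
    """
    normalized = dict(usage)
    key_pairs = [
        ("prompt_tokens", "input_tokens"),
        ("completion_tokens", "output_tokens"),
    ]
    for openai_key, anthropic_key in key_pairs:
        if openai_key in normalized and anthropic_key not in normalized:
            normalized[anthropic_key] = normalized[openai_key]
        elif anthropic_key in normalized and openai_key not in normalized:
            normalized[openai_key] = normalized[anthropic_key]
    return normalized


# Usage as an OpenAI-compatible API (e.g. OpenRouter) would report it:
raw = {"prompt_tokens": 12, "completion_tokens": 34, "total_tokens": 46}
saved = normalize_usage(raw)

# After normalization both key formats are present, so analytics queries
# reading input_tokens/output_tokens also see the data:
assert saved["input_tokens"] == 12 and saved["output_tokens"] == 34
print(json.dumps(saved, sort_keys=True))
```

Calling this in both the update and insert branches of `upsert_message()`, as the PR does, guarantees the normalization runs on every save path rather than only in the streaming middleware.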
Contributor License Agreement
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.