mirror of https://github.com/open-webui/open-webui.git, synced 2026-05-06 19:08:59 -05:00
[GH-ISSUE #19594] feat: Persistent "Invisible" Metadata Object within Chat History for Context Management / OR new DB Table for storing these #34465
Originally created by @Classic298 on GitHub (Nov 29, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/19594
Problem Description
I am trying to implement a "Context Compaction" feature similar to the update recently released by Anthropic for Claude.ai (where earlier context is summarized to prevent hitting token limits and enhance context management).
Currently, implementing this in Open WebUI via Filters is technically possible, but architecturally flawed and unscalable:
There is no native place to store "hidden" state data inside the chat object, which prevents the creation of robust, self-contained, and scalable context management plugins.
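To make the limitation concrete, here is a minimal sketch of what a Filter-based compaction plugin is forced to do today. The class shape (with `inlet`/`outlet` hooks) mirrors Open WebUI's Filter convention, but the field names and summarization logic are illustrative, not existing APIs:

```python
# Hypothetical sketch of the current limitation: a Filter can only hold
# compaction state in memory (or an external store), never inside the
# chat object itself.

class ContextCompactionFilter:
    def __init__(self):
        # In-memory summaries keyed by chat id: lost on restart, not
        # exported with the chat, and racy across concurrent workers.
        self.summaries: dict[str, str] = {}

    def inlet(self, body: dict) -> dict:
        """Runs before the request reaches the model (Filter-style hook)."""
        chat_id = body.get("chat_id", "unknown")
        summary = self.summaries.get(chat_id)
        if summary:
            # Prepend the stored summary as synthetic system context.
            body["messages"] = [
                {"role": "system", "content": f"Summary of earlier turns: {summary}"}
            ] + body["messages"]
        return body

    def outlet(self, body: dict) -> dict:
        """Runs after the model responds; naively 'compacts' older turns."""
        chat_id = body.get("chat_id", "unknown")
        old = body["messages"][:-2]  # everything except the latest exchange
        if old:
            self.summaries[chat_id] = f"{len(old)} earlier message(s) compacted"
        return body
```

Everything stored in `self.summaries` lives outside the chat: it vanishes on restart, does not travel with an exported chat, and is exactly the orphaned external state this issue argues against.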
Desired Solution you'd like
I request the addition of a persistent, "invisible" object/field within the standard Chat History structure (e.g., a metadata or hidden_context key inside the JSON object of the chat).
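A rough sketch of the requested shape, assuming a `hidden_context` key inside the chat's JSON object (the key name and sub-fields are illustrative, not an existing schema):

```python
# Hypothetical shape of the requested field: a hidden_context key that
# lives inside the chat's JSON object but is never rendered in the UI.
chat = {
    "id": "chat-123",
    "title": "Long running conversation",
    "messages": [
        {"role": "user", "content": "latest visible turn"},
    ],
    # The proposed persistent, invisible state bucket (names illustrative):
    "hidden_context": {
        "summary": "User is debugging a Postgres migration.",
        "compacted_up_to": 42,  # index of last message folded into the summary
        "version": 1,
    },
}

def get_hidden_context(chat: dict) -> dict:
    """Read the hidden state, tolerating chats created before the feature."""
    return chat.get("hidden_context", {})

def set_hidden_context(chat: dict, state: dict) -> dict:
    """Overwrite the hidden state in place; it is exported/deleted with the chat."""
    chat["hidden_context"] = state
    return chat
```

Because the field is part of the chat object itself, any plugin can read and rewrite it without touching external storage.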
Key Requirements:
Proposed Workflow with this feature:
Benefit: This makes the chat self-contained. If the chat is deleted, the summary data is deleted with it; if the chat is exported, the intelligence travels with it. No external databases, no race conditions, no orphaned data. It also resolves the concurrency issue described in the Problem Description: when a user sends requests from two separate devices, whichever request is processed last simply becomes the final saved state of the chat (last-write-wins).
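The concurrency claim above amounts to last-write-wins semantics for a whole-object save, which can be illustrated in a few lines (the storage dict stands in for a single JSON column per chat; this is a toy model, not Open WebUI's persistence layer):

```python
from copy import deepcopy

# Minimal illustration of last-write-wins: whichever device's save lands
# last defines the chat's final hidden state.

def save_chat(store: dict, chat_id: str, chat: dict) -> None:
    # Whole-object overwrite, as with a single JSON column per chat.
    store[chat_id] = chat

store: dict = {}
base = {"messages": [], "hidden_context": {"summary": "v0"}}

# Device A and device B both edit the same chat concurrently.
chat_a = deepcopy(base)
chat_a["hidden_context"]["summary"] = "summary from device A"
chat_b = deepcopy(base)
chat_b["hidden_context"]["summary"] = "summary from device B"

save_chat(store, "chat-123", chat_a)
save_chat(store, "chat-123", chat_b)  # processed last, so it wins
```

No cross-device synchronization logic is needed: the final state is simply the last write, which is the behavior the issue finds acceptable.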
Alternatives Considered
As detailed in the problem description, using a Filter plus a complex external setup was considered and rejected. It introduces a single point of failure, adds massive storage overhead, violates the principle of keeping chat data atomic, and is effectively impossible to implement correctly once concurrent requests to the same chat are taken into account. It would require synchronization logic (ACID/CAP-theorem constraints) that should not be the responsibility of a simple context filter.
Additional Context
Anthropic's implementation of context compaction.
Technical Note on Performance: Currently, very long chats in Open WebUI can be slow to load. While this field adds some data per chat, it allows us to stop the visible message history from growing indefinitely, potentially speeding up rendering if the UI only loads the visible messages and keeps the hidden_context in the background.
However, this request assumes that the backend structure can handle slightly larger JSON objects per chat. A general refactor of how large chat objects are loaded might be required in tandem to ensure the UI remains snappy.
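The rendering idea above can be sketched as a function that ships only the uncompacted tail of the message list to the UI. The field names (`hidden_context`, `compacted_up_to`) are the same illustrative names used earlier in this issue, not existing API:

```python
# Sketch: serve only the uncompacted tail of the message list to the UI,
# keeping hidden_context server-side. All field names are illustrative.

def visible_payload(chat: dict) -> dict:
    cutoff = chat.get("hidden_context", {}).get("compacted_up_to", 0)
    return {
        "id": chat["id"],
        "messages": chat["messages"][cutoff:],  # only what the UI renders
    }

chat = {
    "id": "chat-123",
    "messages": [{"role": "user", "content": f"turn {i}"} for i in range(100)],
    "hidden_context": {"summary": "first 90 turns compacted", "compacted_up_to": 90},
}
payload = visible_payload(chat)
# payload carries 10 messages instead of 100; the summary stays in the backend
```

The payload shrinks from 100 messages to 10 while the compacted summary never leaves the backend, which is the snappy-UI outcome the note is after.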
@Classic298 commented on GitHub (Dec 21, 2025):
Related: https://github.com/open-webui/open-webui/discussions/19279
@elacy commented on GitHub (Mar 7, 2026):
I have another use case for this feature. I'm trying to run RPGs through this, and being able to store game state in the conversation messages means that if something goes wrong I can just delete the offending message and the game state is back where it was at the previous message.
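This rollback idea can be sketched by attaching a game-state snapshot to each assistant message, so that deleting a message also deletes its snapshot (the `hidden_state` field and helper functions are hypothetical, assuming per-message invisible metadata):

```python
# Sketch of the RPG use case: a state snapshot rides on each assistant
# message, so deleting a message rolls the game back one turn.
# All field names are illustrative.

chat = {"messages": []}

def record_turn(chat: dict, content: str, game_state: dict) -> None:
    chat["messages"].append({
        "role": "assistant",
        "content": content,
        "hidden_state": game_state,  # proposed per-message invisible field
    })

def current_state(chat: dict) -> dict:
    """The game state is whatever the most recent surviving message carries."""
    for msg in reversed(chat["messages"]):
        if "hidden_state" in msg:
            return msg["hidden_state"]
    return {}

record_turn(chat, "You enter the cave.", {"hp": 10, "location": "cave"})
record_turn(chat, "A troll hits you!", {"hp": 4, "location": "cave"})

chat["messages"].pop()  # delete the bad turn...
# ...and the game state is back at the previous message
```

Because the snapshot lives inside the message, no separate undo log is needed: deleting a message is the undo.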