Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-10 07:43:10 -05:00)
context size limit indicator #3982
Originally created by @dromeuf on GitHub (Feb 19, 2025).
Feature Request
When using a custom model chat, I sometimes have long conversations. How do I know when I've reached the context limit of the LLM I'm using? Is there an option or counter that shows where I am relative to the context window limit? Does Open WebUI offer a "native" counter that shows exactly how many tokens have been used, or an official option for displaying the number of tokens consumed in the interface?
At some point, the quality of a long conversation degrades.
Thanks for your great work. David.
@silentoplayz commented on GitHub (Feb 19, 2025):
Related - https://github.com/open-webui/open-webui/issues/573
@dromeuf commented on GitHub (Feb 20, 2025):
Hello silentoplayz, I'm sorry, but I can't find a clear answer in the linked issue.
@silentoplayz commented on GitHub (Feb 21, 2025):
In short, there's no way within Open WebUI to accurately count how many tokens have been used in a chat. There may be community-made functions, but even those can be inaccurate if the tokenizer the function imports doesn't match the one used by the model you're chatting with.
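To illustrate why any client-side counter is only approximate: different models use different tokenizers (BPE, SentencePiece, etc.), so a generic heuristic, or a mismatched tokenizer, can drift noticeably from the model's true count. Below is a minimal sketch using the common rough rule of thumb of ~4 characters per token for English text; the function names and the 4-chars/token ratio are illustrative assumptions, not an Open WebUI API.

```python
def estimate_tokens(text: str) -> int:
    """Crude heuristic: roughly 4 characters per token for English text.
    Real tokenizers split text differently per model, so treat this
    as an estimate, not an exact count."""
    return max(1, round(len(text) / 4))


def context_usage(messages: list[str], context_limit: int) -> float:
    """Estimated fraction of a model's context window consumed
    by a chat history (hypothetical helper, not part of Open WebUI)."""
    used = sum(estimate_tokens(m) for m in messages)
    return used / context_limit


if __name__ == "__main__":
    chat = [
        "How do I know if I've reached the context limit?",
        "There's no exact counter; estimates depend on the tokenizer.",
    ]
    # Example with an assumed 8192-token context window.
    print(f"~{context_usage(chat, 8192):.1%} of an 8192-token window used")
```

An exact counter would need the model's own tokenizer (e.g. the `tiktoken` library for OpenAI models), which is why a single built-in counter can't be accurate across every backend Open WebUI supports.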