context size limit indicator #3982

Closed
opened 2025-11-11 15:43:46 -06:00 by GiteaMirror · 3 comments
Owner

Originally created by @dromeuf on GitHub (Feb 19, 2025).

Feature Request

When using a custom chat model, I sometimes have long conversations. How do I know when I've reached the context limit of the LLM I'm using? Is there an option or counter that shows where I am in the context window? Does Open WebUI offer a "native" counter that shows exactly how many tokens have been used relative to the model's context limit? Is there an official indicator or option to automatically display the number of tokens consumed in the interface? At some point, the quality of a long conversation degrades.

Thanks for your great work. David.


@silentoplayz commented on GitHub (Feb 19, 2025):

Related - https://github.com/open-webui/open-webui/issues/573


@dromeuf commented on GitHub (Feb 20, 2025):

> Related - #573

Hello silentoplayz, I'm sorry but I can't find a clear answer in the tickets quoted.


@silentoplayz commented on GitHub (Feb 21, 2025):

> > Related - #573
>
> Hello silentoplayz, I'm sorry but I can't find a clear answer in the tickets quoted.

In short, there isn't a way within Open WebUI to accurately count how many tokens have been used within a chat. There may be community-made functions, but even those can be inaccurate if the model you're using doesn't match the tokenizer the function imports.
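For readers who want a rough indicator in the meantime, a client-side estimate can be computed from the chat text. This is only a sketch, not an Open WebUI feature: the ~4 characters/token heuristic and the 8192-token context limit below are illustrative assumptions, and an exact count would require the specific model's own tokenizer (e.g. via a library such as tiktoken).

```python
# Rough, client-side estimate of context-window usage.
# ASSUMPTIONS: ~4 characters per token (only approximates English text)
# and an 8192-token context window; neither comes from Open WebUI.

def estimate_tokens(text: str) -> int:
    """Approximate token count using the common ~4 chars/token rule."""
    return max(1, len(text) // 4)

def context_usage(messages: list[str], context_limit: int = 8192) -> float:
    """Fraction of the assumed context window the chat has consumed."""
    used = sum(estimate_tokens(m) for m in messages)
    return used / context_limit

chat = ["Hello, how are you?", "I'm fine, thanks! " * 50]
print(f"{context_usage(chat):.1%} of context used")
```

Because different models tokenize the same text differently, such an estimate can drift noticeably from the true count, which is exactly the inaccuracy described above.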

Reference: github-starred/open-webui#3982