Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 11:28:35 -05:00)
[GH-ISSUE #13197] Context length isn't working for normal users #55506
Originally created by @somera on GitHub (Apr 24, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/13197
Check Existing Issues
Installation Method
Docker
Open WebUI Version
0.6.6
Ollama Version (if applicable)
0.6.6
Operating System
Ubuntu 24.04.x
Browser (if applicable)
Chrome 135, Firefox 128.9
Confirmation
Expected Behavior
If I send a very long prompt, I expect prompt_tokens to be high too.
Actual Behavior
I found a problem with the context length.
As an admin user, I send a long prompt:
I see 17603 prompt tokens. When I set the context length to 10000, I see 10000 prompt tokens instead. This works for the admin both in their own user settings and in the chat settings.
But for a normal Open WebUI user it doesn't work: in that case I see only 2048 prompt tokens.
Changing the value in my own user settings doesn't help either.
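A minimal sketch of the relationship the numbers above suggest, assuming Ollama truncates the prompt to the configured context window (`num_ctx`), so the reported `prompt_eval_count` is capped at that value (the function name is hypothetical, used only to illustrate the observation):

```python
def effective_prompt_tokens(prompt_tokens: int, num_ctx: int) -> int:
    """Hypothetical model of the observed behavior: the prompt is
    truncated to the context window, so the reported prompt token
    count never exceeds num_ctx."""
    return min(prompt_tokens, num_ctx)

# Numbers from this report:
print(effective_prompt_tokens(17603, 10000))  # admin with context length 10000 -> 10000
print(effective_prompt_tokens(17603, 2048))   # normal user at the 2048 default -> 2048
```

Under this reading, the normal user's 2048 would simply mean their configured context length never reaches the backend.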
Steps to Reproduce
Logs & Screenshots
Nothing available at the moment.
Additional Information
No response
@tjbck commented on GitHub (Apr 25, 2025):
Context length does not mean what you described.
@somera commented on GitHub (Apr 25, 2025):
@tjbck how can the difference between the same prompt for the admin (17603) and the normal user (2048) be explained?
The 2048 for the normal user corresponds to the context length I can set. If I increase it to 4096, though, prompt_eval_count does not become 4096.
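For reference, this is roughly how a context-length setting would reach Ollama, assuming the standard `/api/generate` request shape where per-request options are passed under `options` (the model name and prompt here are placeholders, not taken from this report):

```python
import json

# Sketch of an Ollama /api/generate request body carrying a per-request
# context window override via options.num_ctx.
payload = {
    "model": "llama3",                 # placeholder model name
    "prompt": "very long prompt ...",  # placeholder prompt
    "options": {
        # What the Open WebUI "Context Length" setting should map to.
        # If this override is applied, prompt_eval_count in the response
        # can go up to num_ctx; if it is dropped for normal users (the
        # behavior described above), the count stays at the 2048 default.
        "num_ctx": 4096,
    },
}

print(json.dumps(payload, indent=2))
```

Comparing the request body actually sent for an admin versus a normal user would show whether `num_ctx` is being dropped for non-admin accounts.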