[GH-ISSUE #11958] issue: Context Length completely ignored #31945
Originally created by @frenzybiscuit on GitHub (Mar 22, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/11958
Check Existing Issues
Installation Method
Pip Install
Open WebUI Version
latest
Ollama Version (if applicable)
No response
Operating System
Debian 12
Browser (if applicable)
Firefox
Confirmation
Expected Behavior
When using TabbyAPI as an OpenAI-compatible backend, the context length limit set in the admin model settings is ignored.
I have confirmed with other people using open-webui that this happens.
*This setting is not marked as an Ollama-only feature.
Actual Behavior
Set the context length to 50, send a chat, and watch the generated tokens exceed 50 and keep going.
Tabby log:
2025-03-22 11:29:37.899 INFO: Finished chat completion streaming request 52c1a4f4badd475c8067daab9971af49
2025-03-22 11:29:37.900 INFO: Metrics (ID: 52c1a4f4badd475c8067daab9971af49): 89 tokens generated in 3.7 seconds (Queue: 0.0 s, Process: 95 cached tokens and 17 new tokens at 109.06 T/s, Generate: 25.09 T/s, Context: 112 tokens)
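For context, one likely reason this setting cannot apply to OpenAI-compatible backends: the standard chat-completions request body has no context-window parameter analogous to Ollama's `num_ctx`; the closest client-side knob is `max_tokens`, which only caps the *generated* tokens, not the prompt/context size. A minimal sketch (model name and values are hypothetical) of what an OpenAI-style request body can and cannot express:

```python
def build_chat_payload(model, messages, max_tokens=None):
    """Build an OpenAI-style /v1/chat/completions request body.

    Note: there is no standard field for limiting the context window;
    max_tokens only caps the length of the completion.
    """
    payload = {"model": model, "messages": messages}
    if max_tokens is not None:
        # Caps generated tokens only; prompt/context size is left to the backend.
        payload["max_tokens"] = max_tokens
    return payload

payload = build_chat_payload(
    "my-tabby-model",                      # hypothetical model name
    [{"role": "user", "content": "Hi"}],
    max_tokens=50,
)
```

This is why a context-length limit set in the UI has nothing to map onto when the request is forwarded to a non-Ollama backend.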
Steps to Reproduce
See the Actual Behavior section above.
Logs & Screenshots
No logs
Additional Information
No response
@tjbck commented on GitHub (Mar 23, 2025):
Marked as Ollama Only in dev (e5b7188379).