[GH-ISSUE #13197] issue: Context length isn't working for normal users #55506

Closed
opened 2026-05-05 17:36:34 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @somera on GitHub (Apr 24, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/13197

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

0.6.6

Ollama Version (if applicable)

0.6.6

Operating System

Ubuntu 24.04.x

Browser (if applicable)

Chrome 135, Firefox 128.9

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

If I send a very long prompt, I expect prompt_tokens to be high too.

Actual Behavior

I found a problem with the context length.

As an admin user I'm sending a long prompt:

![Image](https://github.com/user-attachments/assets/8224132b-4e15-4398-96a4-323a6250fff1)

And I see 17603 prompt tokens. When I set the context length to 10000, I see 10000 prompt tokens. This works for the admin, both in the admin's own settings and in the chat settings.

But for a normal Open WebUI user it's not working. In that case I see only 2048 prompt tokens.

![Image](https://github.com/user-attachments/assets/45d35bea-6e9e-4a68-8e56-ca7fbe504fd4)

Changing it in my own settings doesn't work either.

Steps to Reproduce

  1. Do the same as above for an admin user and a normal user. You need a long prompt.

Logs & Screenshots

Nothing available at the moment.

Additional Information

No response

GiteaMirror added the bug label 2026-05-05 17:36:34 -05:00
Author
Owner

@tjbck commented on GitHub (Apr 25, 2025):

Context length does not mean what you described.

Author
Owner

@somera commented on GitHub (Apr 25, 2025):

@tjbck how can the difference between the same prompt for the admin (17603) and the normal user (2048) be explained?

The 2048 for the normal user corresponds to the context_length, which I can set. If I increase this to 4096, then prompt_eval_count is not 4096.
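For context: the 2048 figure matches Ollama's default context window, which applies whenever no `num_ctx` override reaches the backend, and Ollama truncates the prompt to that window before evaluation, capping `prompt_eval_count`. A minimal sketch of how a per-request override is passed to Ollama's `/api/generate` endpoint (the model name and prompt here are placeholders, and this only builds the request body rather than contacting a server):

```python
import json


def build_generate_request(model: str, prompt: str, num_ctx: int) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    options.num_ctx sets the context window for this one request; if it is
    omitted, Ollama falls back to its default context window (2048 tokens),
    which would explain the capped prompt_eval_count seen for normal users.
    """
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {"num_ctx": num_ctx},
    }


body = build_generate_request("llama3", "very long prompt ...", 10000)
print(json.dumps(body))
```

If the frontend setting for a normal user never makes it into `options.num_ctx` of the outgoing request, the backend default wins regardless of what the user configures, which would produce exactly the admin/user discrepancy described above.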


Reference: github-starred/open-webui#55506