Context Length setting is completely ineffective #1914

Closed
opened 2025-11-11 14:56:28 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @CookSleep on GitHub (Aug 28, 2024).

Bug Report

Installation Method

Docker

Environment

  • Open WebUI Version: v0.3.16
  • Operating System: Ubuntu 22.04
  • Browser: Edge 128.0.2739.42

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.
  • [ ] I have included the Docker container logs.
  • [x] I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

After modifying the Context Length to a specific value in the account settings or conversation settings, requests sent to the model should respect that value.

No matter where or how the Context Length is set, the setting does not take effect: the tokens consumed per request, the API balance, and the model's ability to remember information far beyond the configured limit all show this.
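For an Ollama backend, the setting would be expected to reach the model as the `num_ctx` option in the request body (`options.num_ctx` is Ollama's context-window parameter; OpenAI-compatible backends have no direct equivalent, so the frontend would have to truncate history itself). A sketch of the payload one would expect for a Context Length of 50 — the model name and message content here are placeholders, not values from this report:

```python
import json

# Sketch of what the outgoing request should contain when Context Length
# is set to 50 and the backend is Ollama. "num_ctx" is Ollama's context
# window option; everything else here is a placeholder.
expected_payload = {
    "model": "llama3",            # placeholder model name
    "messages": [{"role": "user", "content": "..."}],
    "options": {"num_ctx": 50},   # the setting that appears to be dropped
}
print(json.dumps(expected_payload, indent=2))
```

Comparing this against the actual request body in the browser's network tab would show directly whether the setting is being dropped before the request leaves the frontend.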

Actual Behavior:

Description

Bug Summary:
Context Length setting is completely ineffective

Reproduction Details

Steps to Reproduce:

  1. Modify Context Length to 128000 in account settings
  2. Modify Context Length to 50 in the Chat Controls of a new conversation
  3. Send the model "Please remember, 47867 represents a bug" (a randomly chosen number)
  4. After the model replies, send a new message that is far longer than 50 tokens (about 4000 tokens) and completely unrelated (part of the environment variable content from the Open WebUI documentation)
  5. Then ask the model "What does 47867 represent?" and the model still remembers "it represents a bug"
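The steps above can be illustrated with a small sketch (hypothetical code, none of these names come from the Open WebUI codebase): enforcing a 50-token Context Length should mean trimming older messages so the request stays within the budget, which would drop the "47867" message before step 5.

```python
# Hypothetical illustration of what a 50-token Context Length should do:
# keep only the most recent messages whose combined (approximate,
# whitespace-based) token count fits within the budget.
def trim_to_budget(messages, budget, count_tokens=lambda m: len(m.split())):
    kept, used = [], 0
    for msg in reversed(messages):          # newest first
        cost = count_tokens(msg)
        if used + cost > budget:
            break                           # oldest messages fall off
        kept.append(msg)
        used += cost
    return list(reversed(kept))

history = [
    "Please remember, 47867 represents a bug",  # step 3
    " ".join(["filler"] * 42),                  # stand-in for step 4's long message
    "What does 47867 represent?",               # step 5
]
trimmed = trim_to_budget(history, budget=50)
# The oldest message no longer fits the 50-token budget, so a correctly
# enforced limit would leave the model unable to recall what 47867 means.
```

That the model still answers "it represents a bug" in step 5 suggests the full, untrimmed history is being sent regardless of the setting.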

Logs

Browser Console Logs:
OpenWebUI-1724846765350.log (https://github.com/user-attachments/files/16783011/OpenWebUI-1724846765350.log)

Additional Information

I have also found reports of this issue on several popular forums, and it is not limited to these few versions. The Context Length setting is ineffective, causing people to spend far more on API costs than expected. I hope it can be fixed as soon as possible!


@CookSleep commented on GitHub (Aug 28, 2024):

Even if the Context Length is set to 50 tokens in the account settings, the problem still exists!


Reference: github-starred/open-webui#1914