issue: context limit not applied from user settings #4386

Closed
opened 2025-11-11 15:52:56 -06:00 by GiteaMirror · 1 comment

Originally created by @talentoscope on GitHub (Mar 12, 2025).

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

0.5.20

Ollama Version (if applicable)

No response

Operating System

0.5.13

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

Changing the Context Length in the chat controls should propagate to Ollama as num_ctx.
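
For reference, num_ctx can be set per-request through Ollama's REST API options. A minimal sketch that sends the option directly to Ollama, bypassing Open WebUI (the URL/port assume a default local install, and the model name is an example):

```python
# Minimal sketch: send num_ctx straight to Ollama's /api/chat endpoint,
# bypassing Open WebUI, to confirm the option is honored when present.
# The URL/port and model name are assumptions for a default local install.
import json
import urllib.request

payload = {
    "model": "llama3.1",            # example model; substitute your own
    "messages": [{"role": "user", "content": "hello"}],
    "options": {"num_ctx": 16000},  # the value the chat control should forward
    "stream": False,
}
req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["message"]["content"])
```

If Ollama's load log reports the larger context after a request like this, the option itself works and the gap is in how Open WebUI builds its requests.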

Actual Behavior

It doesn't appear to, even in a new chat. Ollama still shows

```
llama_init_from_model: n_ctx_per_seq (2048) < n_ctx_train (131072) -- the full capacity of the model will not be utilized
```

despite the setting being changed to 16000 or any other value.
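
One way to see which context size Ollama actually used is to scan its server log for the load line quoted above. A rough sketch (the log path is an assumption and varies by install, e.g. Docker logs, systemd journal, or a plain file):

```python
# Rough sketch: extract the n_ctx_per_seq value from an Ollama server log
# to see what context size the model was actually loaded with.
# LOG_PATH is an assumption; adjust for your installation.
import re

LOG_PATH = "/var/log/ollama.log"  # assumed location

pattern = re.compile(r"n_ctx_per_seq \((\d+)\)")
with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
    for line in f:
        match = pattern.search(line)
        if match:
            print("model loaded with n_ctx_per_seq =", match.group(1))
```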

Steps to Reproduce

See above

Logs & Screenshots

N/A

Additional Information

No response

GiteaMirror added the bug label 2025-11-11 15:52:56 -06:00

@talentoscope commented on GitHub (Mar 12, 2025):

This may be a regression of bug #8368

Reference: github-starred/open-webui#4386