[GH-ISSUE #19154] issue: model streaming parameter set to True, even though the setting is set to False #57455

Closed
opened 2026-05-05 20:56:50 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @DirkRemmers on GitHub (Nov 13, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/19154

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

v0.6.36

Ollama Version (if applicable)

No response

Operating System

Windows 11

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

When I set a model's `Stream Chat Response` parameter to `Off` in the Admin Panel, the request to the model should include the `'stream': False` setting. This works perfectly fine in version 0.6.34.

Actual Behavior

In version 0.6.36, this seems to be broken. When I set a model's `Stream Chat Response` parameter to `Off` in the Admin Panel, the request to the model now includes the `'stream': True` setting instead.

Steps to Reproduce

  1. Open the admin panel and select a model
  2. Set the Stream Chat Response parameter to Off
  3. Save the model settings
  4. Make a request to the model and observe the request

Test this both in versions 0.6.34 and 0.6.36 to see the difference. I did not test 0.6.35.
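For step 4, one way to observe the outgoing request without digging into Open WebUI's internals is to point a model connection at a local fake upstream that logs each payload it receives. This is a minimal sketch (the handler, helper name, and the idea of wiring it up as an OpenAI-style connection are illustrative assumptions, not anything from the report):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoHandler(BaseHTTPRequestHandler):
    """Fake OpenAI-compatible upstream that records each request payload."""
    seen = []  # payloads of every POST received

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        EchoHandler.seen.append(payload)
        print("stream =", payload.get("stream"))  # the flag under test
        body = json.dumps({"choices": []}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request access logs

def start_fake_upstream(port=0):
    """Run the fake upstream on a daemon thread; returns the live server."""
    server = HTTPServer(("127.0.0.1", port), EchoHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Point an OpenAI-style connection at the fake upstream's address (a hypothetical setup), send a chat message, and check whether `stream` arrives as `False` or `True`.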

Logs & Screenshots

Both requests are made while the Stream Chat Response parameter is set to Off:

Request contents version 0.6.34:
request = {'stream': False, 'model': 'test-model', 'messages': [{'role': 'user', 'content': 'hi there'}]}

Request contents version 0.6.36:
request = {'stream': True, 'model': 'test-model', 'messages': [{'role': 'user', 'content': 'hi there'}]}
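The two payloads above are consistent with the admin-configured model parameter losing a merge-precedence decision somewhere between the client's defaults and the stored model params. A hypothetical illustration of the intended precedence (this is not Open WebUI's actual code; function and variable names are made up):

```python
def apply_model_params(form_data: dict, model_params: dict) -> dict:
    """Hypothetical merge: admin-configured model params (e.g. stream=False)
    should override whatever default the client sent. Unset params (None)
    leave the client's value untouched."""
    overrides = {k: v for k, v in model_params.items() if v is not None}
    return {**form_data, **overrides}
```

With this precedence, a client default of `'stream': True` merged with a model-level `'stream': False` yields `False`, matching the 0.6.34 behavior; the 0.6.36 payload looks as if the merge order were reversed or the model-level value dropped.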

Additional Information

No response

GiteaMirror added the bug label 2026-05-05 20:56:50 -05:00
Author
Owner

@silentoplayz commented on GitHub (Nov 13, 2025):

I believe this issue is valid: toggling `Stream Chat Response` to `Off` at the model level for an external GroqCloud model does not stop the model's responses from being streamed in chats on my end.

Edit: I believe these two screenshots validate the issue:

Image: https://github.com/user-attachments/assets/21c268c5-30a4-46ec-b778-1b11e89baa1b

Image: https://github.com/user-attachments/assets/47ecb501-164b-42a1-b91d-d08fcc2d5a9f
Author
Owner

@DirkRemmers commented on GitHub (Nov 13, 2025):

Thanks for visualizing it!

Author
Owner

@tjbck commented on GitHub (Nov 17, 2025):

Addressed in dev! f138be9d8a55f5bc5ec3c95e523eb4341e6ddeca


Reference: github-starred/open-webui#57455