[GH-ISSUE #21162] Regression: API stream parameter ignored after fix #19154 (v0.7.2) #58071

Closed
opened 2026-05-05 22:17:11 -05:00 by GiteaMirror · 5 comments

Originally created by @konradzamojski on GitHub (Feb 4, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/21162

Bug Report: API stream parameter still ignored after #19154 fix

🐛 Description

While issue #19154 fixed the problem where the stream_response setting was not being applied in v0.6.36, it introduced a new regression: the stream parameter sent in API requests is now always overridden by the model's stream_response setting.

This breaks OpenAI API compatibility and makes it impossible to control streaming behavior per-request, which is critical for frameworks like LangChain.

📋 Environment

  • OpenWebUI Version: 0.7.2 (latest as of 2026-01-10)
  • Installation Method: pip (Python 3.12)
  • OS: Linux (Ubuntu/Debian)
  • Related Issues: #19154 (fixed but introduced regression)

🔍 Problem History

Issue #19154 (November 2025)

Problem: In v0.6.36, the stream_response setting was not being applied
Fix: Commit f138be9 added code to apply the model's stream_response setting
Result: The model setting is now applied, but the API parameter is now ignored

Current Problem (v0.7.2+)

Problem: API stream parameter is always overridden by model setting
Impact: Cannot control streaming per-request via API
Breaks: OpenAI API compatibility, LangChain integration

🔧 Root Cause

Current code in backend/open_webui/main.py (after fix #19154):

```python
# Model Params
if model_info_params.get("stream_response") is not None:
    form_data["stream"] = model_info_params.get("stream_response")
```

Problem: This code unconditionally overwrites the stream parameter from API requests.
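The override can be reproduced in isolation. The sketch below uses plain dicts in place of OpenWebUI's actual request and model-config objects (so `form_data` and `model_info_params` here are stand-ins, not the real variables):

```python
# Hypothetical request body: the client explicitly asks for a non-streamed reply.
form_data = {"model": "test", "stream": False, "messages": []}

# Hypothetical model config with stream_response enabled in the admin UI.
model_info_params = {"stream_response": True}

# Current logic after #19154: overwrites the client's choice unconditionally.
if model_info_params.get("stream_response") is not None:
    form_data["stream"] = model_info_params.get("stream_response")

print(form_data["stream"])  # True -- the explicit stream=False was discarded
```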

🎯 Expected Behavior vs Actual

Test 1: API with stream: false

```bash
curl -X POST /api/v1/chat/completions \
  -d '{"model": "test", "stream": false, "messages": [...]}'
```

Expected: JSON response (not streamed) - API parameter should be respected
Actual: Streamed response (SSE) - API parameter ignored

Test 2: API with stream: true

```bash
curl -X POST /api/v1/chat/completions \
  -d '{"model": "test", "stream": true, "messages": [...]}'
```

Expected: SSE chunks (streamed)
Actual: Depends on model setting, not API parameter

Test 3: No stream parameter

```bash
curl -X POST /api/v1/chat/completions \
  -d '{"model": "test", "messages": [...]}'
```

Expected: Use model's default setting
Actual: Uses model setting (this works correctly)
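For completeness, the same three probes can be scripted. This is only a sketch: the base URL and token are placeholders for a local instance, and the requests are constructed but not sent, just to show which payloads carry a stream key:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080"  # assumed local OpenWebUI instance
API_KEY = "YOUR_API_KEY"            # placeholder token

def chat_request(stream=None):
    """Build a chat-completions request; omit `stream` entirely when None."""
    body = {"model": "test", "messages": [{"role": "user", "content": "ping"}]}
    if stream is not None:
        body["stream"] = stream
    return urllib.request.Request(
        f"{BASE_URL}/api/v1/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

print(json.loads(chat_request(stream=False).data)["stream"])  # Test 1: False
print(json.loads(chat_request(stream=True).data)["stream"])   # Test 2: True
print("stream" in json.loads(chat_request().data))            # Test 3: False (key omitted)
```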

Proposed Solution

The fix should respect both requirements:

  1. Apply model's stream_response setting (from #19154)
  2. Allow API parameter to override model setting (OpenAI compatibility)
Recommended fix:

```python
# Model Params
# Use stream from API if present, otherwise use model default
if "stream" not in form_data and model_info_params.get("stream_response") is not None:
    form_data["stream"] = model_info_params.get("stream_response")
```

This solution:

  • Keeps #19154 fix working - model setting is applied when no API parameter
  • Fixes regression - API parameter has priority when present
  • Maintains OpenAI API compatibility
  • Enables LangChain integration
  • No breaking changes

🧪 Test Results

After applying the proposed fix:

| Test Case | API Param | Model Setting | Expected Result | Actual Result |
|-----------|-----------|---------------|-----------------|---------------|
| 1 | `stream: false` | any | No streaming | No streaming |
| 2 | `stream: true` | any | Streaming | Streaming |
| 3 | not provided | `false` | No streaming | No streaming |
| 4 | not provided | `true` | Streaming | Streaming |
All tests passing!
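Rows 1-4 can be checked mechanically against the proposed guard. Below, `effective_stream` is a hypothetical stand-in for the patched code path in `main.py`, with "any" in the table expanded to both model settings:

```python
def effective_stream(api_stream, stream_response):
    """Proposed precedence: the API's `stream` wins; the model's
    `stream_response` only fills in when the client omitted the key."""
    form_data = {} if api_stream is None else {"stream": api_stream}
    model_info_params = {"stream_response": stream_response}
    if "stream" not in form_data and model_info_params.get("stream_response") is not None:
        form_data["stream"] = model_info_params.get("stream_response")
    return form_data.get("stream")

cases = [
    (False, True, False), (False, False, False),  # row 1: stream: false, any model setting
    (True, True, True), (True, False, True),      # row 2: stream: true, any model setting
    (None, False, False),                         # row 3: omitted, model default false
    (None, True, True),                           # row 4: omitted, model default true
]
for api, model, expected in cases:
    assert effective_stream(api, model) is expected
print("all table rows pass")
```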

💥 Impact

Who is affected:

  • Anyone using OpenWebUI as OpenAI-compatible API endpoint
  • LangChain users (cannot control streaming=True/False)
  • Custom applications using the API
  • Any client that needs per-request streaming control

Severity: High - Breaks API compatibility

🔗 Related Issues & PRs

  • #19154 - Original issue that introduced the regression
  • Commit f138be9 - The commit that needs refinement
  • Related to tool calling issues when streaming is disabled (#12135, #18121)

📝 Additional Notes

Why This Is Important

  1. OpenAI API Compatibility: OpenAI's API respects the stream parameter per-request
  2. LangChain Requirement: Frameworks need to control streaming dynamically
  3. Regression from Fix: The fix for #19154 solved one problem but created another
  4. Simple Solution: A one-line guard (if "stream" not in form_data and ...)

Backward Compatibility

The proposed fix is 100% backward compatible:

  • Existing behavior for requests without stream param: unchanged
  • Model settings still work as defaults: unchanged
  • Only change: API parameter is now respected (as per OpenAI spec)

🙋 Offer to Contribute

I have:

  • Tested the fix thoroughly (all scenarios)
  • Verified no breaking changes
  • Confirmed OpenAI API compatibility

I am ready to submit a PR if needed.

Would you like me to create a Pull Request with this fix?


Priority: High (API compatibility regression)
Difficulty: Trivial (one-line change)
Impact: High (affects all API users)
Regression from: #19154 (commit f138be9)

GiteaMirror added the bug label 2026-05-05 22:17:11 -05:00

@silentoplayz commented on GitHub (Feb 4, 2026):

A PR is welcome here!


@konradzamojski commented on GitHub (Feb 5, 2026):

I've created a PR to fix this issue: #21175

The fix is simple and maintains backward compatibility while respecting the API stream parameter.


@konradzamojski commented on GitHub (Feb 5, 2026):

Updated: Created PR #21176 (targeting dev branch with CLA)

The fix is ready for review!


@Classic298 commented on GitHub (Feb 5, 2026):

You didn't add the CLA (correctly)


@tjbck commented on GitHub (Feb 5, 2026):

This is intended, you should set it to default for your use case.


Reference: github-starred/open-webui#58071