Mirror of https://github.com/open-webui/open-webui.git, synced 2026-05-06 10:58:17 -05:00
# [GH-ISSUE #23921] fix: Premature finish_reason 'stop' on first SSE chunk breaks API clients for Ollama reasoning models #35637
Originally created by @pvyswiss on GitHub (Apr 21, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/23921
## Bug Description

API clients receive the first SSE chunk, then the stream appears to end. Ollama reasoning models (DeepSeek R1, Gemma 4) hang after "let me think". The Web UI works fine.
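This is exactly the behavior a spec-compliant streaming client should exhibit: it stops reading as soon as a chunk carries a non-null `finish_reason`, so a premature `"stop"` on the first chunk truncates the response. A minimal sketch with simulated chunks (the field names follow the OpenAI chat-completions streaming format; the chunk contents here are illustrative, not captured from Ollama):

```python
import json

# Simulated SSE data lines in the OpenAI chat-completions chunk format.
# The first chunk already (incorrectly) carries finish_reason "stop",
# as described in the bug report.
sse_lines = [
    "data: " + json.dumps({"choices": [{"delta": {"content": "let me think"}, "finish_reason": "stop"}]}),
    "data: " + json.dumps({"choices": [{"delta": {"content": " ...more reasoning..."}, "finish_reason": None}]}),
    "data: [DONE]",
]

def read_stream(lines):
    """Collect content the way a spec-compliant client does:
    stop as soon as a chunk carries a non-null finish_reason."""
    text = []
    for line in lines:
        payload = line.removeprefix("data: ")
        if payload == "[DONE]":
            break
        choice = json.loads(payload)["choices"][0]
        text.append(choice["delta"].get("content") or "")
        if choice["finish_reason"] is not None:
            break  # spec-compliant clients treat this as end of stream
    return "".join(text)

print(read_stream(sse_lines))  # -> "let me think" (the rest is never read)
```

Everything after the first chunk is silently dropped, which matches the "hangs after 'let me think'" symptom seen by API clients.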
Related to: #23917 (Bug 2)
## Root Cause
`openai_chat_chunk_message_template()` in `misc.py` uses truthy checks. When Ollama sends the first chunk for a reasoning model, both `content` and `thinking` are empty strings (`""`). Empty strings are falsy in Python, so `finish_reason: "stop"` is set on the very first chunk. API clients that comply with the OpenAI spec close the stream on `finish_reason: "stop"`.

## Fix
Only set `finish_reason: "stop"` when `usage` is present (the final chunk).
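Sketched as a before/after of the template helper (a reconstruction from the description above; the actual signature and fields of `openai_chat_chunk_message_template()` in `misc.py` may differ):

```python
# Hypothetical reconstruction of the helper described in this issue;
# the real openai_chat_chunk_message_template() may differ in detail.

def chunk_template_buggy(content, thinking=None, usage=None):
    # Truthy check: "" is falsy, so the very first (empty) chunk of a
    # reasoning model gets finish_reason "stop".
    finish_reason = None if content or thinking else "stop"
    return {"choices": [{"delta": {"content": content}, "finish_reason": finish_reason}]}

def chunk_template_fixed(content, thinking=None, usage=None):
    # Only mark the stream finished on the final chunk, which is the
    # one that carries usage statistics.
    finish_reason = "stop" if usage else None
    return {"choices": [{"delta": {"content": content}, "finish_reason": finish_reason}]}

# First chunk of a reasoning model: content and thinking are both "".
print(chunk_template_buggy("", thinking="")["choices"][0]["finish_reason"])   # stop -> clients disconnect
print(chunk_template_fixed("", thinking="")["choices"][0]["finish_reason"])   # None -> stream stays open

# Real final chunk: usage is present, so the fixed version still terminates.
print(chunk_template_fixed("", usage={"total_tokens": 42})["choices"][0]["finish_reason"])  # stop
```

The guard on `usage` works because, in both the Ollama and OpenAI streaming formats, usage statistics are only attached to the terminal chunk.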
## Impact

`finish_reason`)
## File

`backend/open_webui/utils/misc.py`

## Reproduction
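One way to observe the behavior from the command line (a sketch: the host, API key, and model name below are placeholders, not taken from the original issue; `deepseek-r1` stands in for any Ollama reasoning model):

```shell
# Stream a chat completion from Open WebUI's OpenAI-compatible endpoint
# and print each chunk's finish_reason. Adjust host, key, and model to
# your own setup.
curl -sN http://localhost:3000/api/chat/completions \
  -H "Authorization: Bearer $OPEN_WEBUI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-r1", "stream": true,
       "messages": [{"role": "user", "content": "Think step by step: what is 2+2?"}]}' \
| grep -o '"finish_reason":[^,}]*'
# With the bug present, the very first chunk already shows
# "finish_reason":"stop"; with the fix, it is null until the final chunk.
```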