mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-05 18:38:17 -05:00
[GH-ISSUE #22177] bug: Streaming tool call function name doubled by delta accumulation (GPT-5/5.1) #58315
Originally created by @madnight on GitHub (Mar 3, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/22177
Check Existing Issues
Installation Method
Docker (Cloud Run)
Open WebUI Version
v0.8.5
Ollama Version (if applicable)
No response
Operating System
Linux (GCP Cloud Run)
Browser (if applicable)
Chrome
Confirmation

Expected Behavior
When using native function calling with streaming enabled, the tool call function name should be captured correctly from the streaming deltas, and the tool should execute successfully.
For example, an MCP tool registered as `my_server_search` should be called as `my_server_search`.

Actual Behavior
The function name is doubled during streaming delta accumulation. For example, `my_server_search` becomes `my_server_searchmy_server_search`. This causes a silent failure because the doubled name does not match any key in `tools_dict`, so `tool_function_name in tools` evaluates to `False` and the tool is never executed. The user sees "Tool Executed" in the UI, but the tool was never actually called: a silent failure with no error message.
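To make the failure mode concrete, here is a minimal sketch (the names and data structures are illustrative, not the actual Open WebUI internals):

```python
# Illustrative only: a registered tool and the name produced by the buggy
# delta accumulation. The doubled key misses the dict lookup, so the tool
# call is silently skipped instead of raising an error.
tools_dict = {"my_server_search": lambda query: f"results for {query}"}

tool_function_name = "my_server_search" + "my_server_search"  # doubled by +=

if tool_function_name in tools_dict:
    result = tools_dict[tool_function_name]("open-webui")
else:
    result = None  # silent failure: the UI still shows "Tool Executed"

print(result)  # None
```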
Root Cause
In `backend/open_webui/utils/middleware.py`, the streaming tool call delta handler accumulates the function name with `+=`. The first delta for a tool call index creates the entry with `name = "my_server_search"` (via `response_tool_calls.append(delta_tool_call)` at line ~3704). When a subsequent delta for the same index also includes `function.name = "my_server_search"`, the `+=` operator concatenates it: `"my_server_search" + "my_server_search"` = `"my_server_searchmy_server_search"`.

This is model-dependent. Some API providers (notably GPT-5 and GPT-5.1 via Azure) redundantly include the function name in follow-up deltas, while others (Claude, GPT-4o) only send it in the first delta.
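As a rough reconstruction (simplified: field names follow the OpenAI streaming delta format, but the loop structure is paraphrased rather than copied from middleware.py), the buggy accumulation behaves like this:

```python
# Simplified reconstruction of the buggy accumulation. Providers such as
# GPT-5/5.1 via Azure re-send function.name in follow-up deltas, so the
# += concatenates it onto the already-complete name.
deltas = [
    {"index": 0, "function": {"name": "my_server_search", "arguments": '{"que'}},
    {"index": 0, "function": {"name": "my_server_search", "arguments": 'ry": "x"}'}},
]

response_tool_calls = []
for delta in deltas:
    existing = next(
        (t for t in response_tool_calls if t["index"] == delta["index"]), None
    )
    if existing is None:
        delta["function"].setdefault("name", "")
        delta["function"].setdefault("arguments", "")
        response_tool_calls.append(delta)  # first delta: name arrives complete
    else:
        # BUG: some providers re-send the complete name, so += doubles it
        existing["function"]["name"] += delta["function"].get("name") or ""
        existing["function"]["arguments"] += delta["function"].get("arguments") or ""

print(response_tool_calls[0]["function"]["name"])
# my_server_searchmy_server_search
```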
Arguments are correctly accumulated with `+=` because they are genuinely split across deltas. But function names are always sent complete in the first delta per the OpenAI streaming spec; they should not be accumulated.

Steps to Reproduce
1. Use a model whose streaming API re-sends `function.name` in follow-up delta chunks (GPT-5 or GPT-5.1 via Azure OpenAI).
2. Enable native function calling with streaming and trigger a tool call.
3. Observe that the accumulated function name is doubled (`my_server_searchmy_server_search` instead of `my_server_search`).

The bug does not reproduce with Claude models or GPT-4o because their APIs do not re-send the function name in subsequent deltas.
Data from Production Database
Query across 18 chats containing MCP tool calls shows a clear model-specific pattern:
| Model | Chats | Stored tool name |
| --- | --- | --- |
| gpt-5.1 | 2 | `<name><name>` |
| gpt-5 | 4 | `<name><name>` |
| gpt-4o | 2 | `<name>` |
| `claude-*` | 10 | `<name>` |

Proposed Fix
Change the name accumulation to a conditional assignment that only sets the name if it has not been set yet:
This is safe because:

- The first delta for a tool call index sets the name when the entry is created via `append()`, so the conditional does not overwrite it.
- If the first delta carries no name, `setdefault("name", "")` at entry creation sets it to `""`, which is falsy, so the subsequent delta correctly sets it.

Related Issues
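A minimal sketch of the conditional-assignment idea (a paraphrase, not the actual patch; the helper name is hypothetical):

```python
# Sketch of the fix: only assign the name while it is still unset ("" or
# None). Arguments keep using += because they really are split across deltas.
def accumulate(existing_function: dict, delta_function: dict) -> None:
    if not existing_function.get("name"):  # unset or "" -> take the name
        existing_function["name"] = delta_function.get("name") or ""
    existing_function["arguments"] = existing_function.get("arguments", "") + (
        delta_function.get("arguments") or ""
    )

fn = {"name": "my_server_search", "arguments": '{"que'}
accumulate(fn, {"name": "my_server_search", "arguments": 'ry": "x"}'})
print(fn["name"])       # my_server_search
print(fn["arguments"])  # {"query": "x"}
```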
52ccab8

Logs & Screenshots
No response
Additional Information
This bug affects any tool (MCP or otherwise) when used with native function calling and a model whose streaming API includes `function.name` in multiple delta chunks. It is not specific to MCP or any particular tool; the bug is in the generic streaming tool call accumulation logic.

@Classic298 commented on GitHub (Mar 3, 2026):
is this reproducible on latest? I can't reproduce with GPT 5.2
@madnight commented on GitHub (Mar 3, 2026):
@Classic298 I currently do not have access to GPT 5.2 for testing, but I can consistently reproduce the issue with GPT 5.1 from Azure (via LiteLLM) and the Atlassian MCP server (Rovo), which is configured using OAuth 2.1 and DCR (https://mcp.atlassian.com/v1/mcp). The issue is independent of the specific MCP tool server, though.
I'm also not sure why @tjbck said in https://github.com/open-webui/open-webui/issues/16138 that double naming is "Intended behaviour for external tool servers". The problem is that, even if this were intended behavior, the tool calling simply does not work in this case, so it is not just a naming issue.
@madnight commented on GitHub (Mar 3, 2026):
I just noticed that the code can, under certain conditions, also generate quadruple names. The repetition seems to follow a doubling pattern, with longer names increasingly rare.
@theepicsaxguy commented on GitHub (Mar 4, 2026):
I also experience this with 5.2/5.2-codex and 5.3-codex.
Models from other providers work fine, but OpenAI models specifically seem to have this issue.
@theepicsaxguy commented on GitHub (Mar 4, 2026):
Here is a temporary Filter workaround I managed to get working with GPT 5.3-codex. @madnight
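The filter code itself is not reproduced here. As a rough illustration of the idea, a doubled or quadrupled name can be collapsed whenever the string is an exact repetition of its own half (a hypothetical helper, not the actual filter):

```python
def collapse_repeated_name(name: str) -> str:
    """Collapse names like 'foofoo' or 'foofoofoofoo' back to 'foo'.

    Hypothetical dedup heuristic: repeatedly halve the string while the
    two halves are identical. Normal tool names are not an exact
    self-repetition, so they pass through unchanged (a legitimate name
    that happens to repeat itself exactly would be a false positive).
    """
    while len(name) % 2 == 0 and len(name) > 0:
        half = len(name) // 2
        if name[:half] == name[half:]:
            name = name[:half]
        else:
            break
    return name

print(collapse_repeated_name("my_server_searchmy_server_search"))  # my_server_search
print(collapse_repeated_name("abcd"))  # abcd (unchanged)
```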
@Classic298 commented on GitHub (Mar 4, 2026):
hey guys
can you all please test this?
https://github.com/open-webui/open-webui/pull/22235
and also test for regressions, but there should be none.
I also researched how different providers handle this, and it seems the OpenAI spec demands that the function name be sent as one piece even if it is long, i.e. in a single delta. Any provider not doing that is violating the spec, but as far as I can tell all of them do it correctly.
so yeah please go ahead and test it @theepicsaxguy @madnight
we need confirmation here and also ideally tested on multiple models and providers
thanks!
@Classic298 commented on GitHub (Mar 6, 2026):
I can now reproduce this also on GPT 5.4.
@Classic298 commented on GitHub (Mar 6, 2026):
Tested my PR in production. Works perfectly.
@Classic298 commented on GitHub (Mar 6, 2026):
Observed new and very weird behaviour with gpt-5.4 via API.
I have expanded the PR to catch this weird behaviour; works so far.
@Classic298 commented on GitHub (Mar 8, 2026):
should be fixed by d7efdcce2b

Testing wanted
@Classic298 commented on GitHub (Mar 8, 2026):
459a60a242