Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 19:08:59 -05:00)
[GH-ISSUE #22251] issue: .split error causes no responses in chat, stop sequences and possibly MCP servers involved (v0.8.8) #19673
Originally created by @reversewave on GitHub (Mar 5, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/22251
Check Existing Issues
Installation Method
Docker
Open WebUI Version
v0.8.8
Ollama Version (if applicable)
N/A
Operating System
Windows 11 Pro
Browser (if applicable)
Vivaldi 7.8.3925.76 (Official Build) (64-bit)
Confirmation
I have read and followed the instructions in README.md.

Expected Behavior
Chat should work normally. Messages should send and get responses back. Having a stop sequence set on a model should not cause the frontend to crash and return no response.
Actual Behavior
Chat stopped returning responses entirely, with no clear trigger. No generation request ever reached the backend, meaning the crash was happening in the browser before anything was sent. A `.split` TypeError appeared in the console every time a message was sent. This happened across every model tested, not just one. The fix was going to Settings -> General -> Advanced Parameters and removing the stop sequence. However, chat was already broken before any stop sequences were added, so MCP tool server misconfiguration may have also played a role.

Steps to Reproduce
1. Add an external OpenAI-compatible connection (https://api.linkapi.ai/v1, Bearer token auth).
2. Run a DuckDuckGo MCP server at localhost:8001 and mcp-server-fetch at localhost:8002, both using uvx mcpo via a batch file.
3. Set stop sequences on a model (n/n/Human:, User:, and random strings like dkdkd).
4. Send any chat message: a `.split` TypeError appears on every send attempt.

Logs & Screenshots
Additional Information
This looks related to #19486 and #19500, which had the same `.split` error in the tool/middleware layer. In those issues a workaround was to add a single comma to the Function Name Filter List in the external tool server settings. I tried this and it did not fix the issue in my case. The root cause appears to be stop sequences being set on models, though MCP server misconfiguration may also be a contributing factor.

Full troubleshooting conversation with the Open WebUI Kapa.AI bot on Discord.
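The crash pattern described above can be sketched as follows. This is an illustrative guess at the failure mode, not Open WebUI's actual code: the function name `parseStopSequences` and its signature are hypothetical. The point is that calling `.split` on a value that is not a string (e.g. an array of stop sequences, or undefined) throws exactly the kind of TypeError seen in the console, before any request is sent.

```typescript
// Hypothetical sketch: a stop parameter the frontend assumes is a
// comma-separated string may instead be undefined or already an array.
// The crashing pattern would be: (stop as string).split(",")
// -> "TypeError: stop.split is not a function" when stop is not a string.
function parseStopSequences(stop: unknown): string[] {
  if (typeof stop === "string") {
    // Split on commas, trim whitespace, and drop empty entries.
    return stop.split(",").map((s) => s.trim()).filter(Boolean);
  }
  if (Array.isArray(stop)) {
    // Already a list: keep only string entries.
    return stop.filter((s): s is string => typeof s === "string");
  }
  // undefined/null/other types: treat as "no stop sequences".
  return [];
}

console.log(parseStopSequences("n/n/Human:,User:")); // ["n/n/Human:", "User:"]
console.log(parseStopSequences(undefined)); // []
```

A defensive type check like this is the usual fix for such crashes: the UI degrades to "no stop sequences" instead of failing before the request is dispatched.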
My setup:
- External connection: https://api.linkapi.ai/v1 (OpenAI-compatible, Bearer token auth)
- MCP servers via uvx mcpo: DuckDuckGo on port 8001, mcp-server-fetch on port 8002

@tjbck commented on GitHub (Mar 7, 2026):
Should be fixed in dev.
@Classic298 commented on GitHub (Mar 7, 2026):
c7d1d1e390