mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-06 19:08:59 -05:00
[GH-ISSUE #21832] issue: Post-tool “thinking” text leaks outside reasoning tags again #58254
Originally created by @kksaohun on GitHub (Feb 24, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/21832
Check Existing Issues
Installation Method
Docker
Open WebUI Version
v0.8.5 (latest)
Ollama Version (if applicable)
Operating System
Linux
Browser (if applicable)
Edge
Confirmation
Expected Behavior
Thinking should be hidden under "Thought for ....".
Actual Behavior
The LLM is MiniMax M2.5 served by vLLM via an OpenAI-compatible API, using Open WebUI's "native" tool calling.
(It looks very similar to #16973, which is older and closed. Perhaps the same behaviour recurs under certain circumstances?)
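To make the reported leak concrete, here is a rough, hypothetical sketch of how a client might split a model response into hidden reasoning and visible content by `<think>`/`</think>` tags. The tag names and the splitting logic are illustrative assumptions, not Open WebUI's or vLLM's actual parser: the point is only that any reasoning text emitted outside the tags (e.g. after a tool call, with no new opening tag) ends up in the visible content, which matches the symptom described above.

```python
import re

# Hypothetical illustration only: tag names and single-pass splitting
# are assumptions, not the actual Open WebUI / vLLM implementation.
THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)

def split_reasoning(text: str) -> tuple[str, str]:
    """Return (reasoning, content).

    Reasoning text that appears OUTSIDE the <think>...</think> tags --
    for example, thinking emitted after a tool call without a fresh
    opening tag -- falls through into `content`, i.e. it "leaks" into
    the visible reply instead of the collapsed "Thought for ..." block.
    """
    reasoning = "".join(THINK_RE.findall(text))
    content = THINK_RE.sub("", text)
    return reasoning.strip(), content.strip()

# Normal case: reasoning is captured and hidden.
r1, c1 = split_reasoning("<think>plan the answer</think>Final answer.")

# Leak case: post-tool-call thinking arrives with no tags at all,
# so it is indistinguishable from regular content.
r2, c2 = split_reasoning("Hmm, the tool result suggests... Final answer.")
```

Under this sketch, the leak is purely a tagging problem: if the inference side (or its reasoning parser) fails to re-wrap post-tool thinking in tags, no downstream client can recover the distinction.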
Steps to Reproduce
Logs & Screenshots
Browser (does not seem relevant):
docker (does not seem relevant):
Additional Information
No response
@Classic298 commented on GitHub (Feb 24, 2026):
I can't reproduce with hosted MiniMax. Works just fine. Could this be an issue with vLLM?
@kksaohun commented on GitHub (Feb 24, 2026):
Not sure. vLLM is being run with the "official" recommended parameters from here: https://docs.vllm.ai/projects/recipes/en/latest/MiniMax/MiniMax-M2.html#launching-m25m21m2-with-vllm
@kksaohun commented on GitHub (Feb 24, 2026):
Could be an incompatibility between vLLM's `--reasoning-parser minimax_m2_append_think` and Open WebUI...?

@tjbck commented on GitHub (Feb 24, 2026):
Likely an inference side issue.