mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-06 19:08:59 -05:00
[GH-ISSUE #14313] issue: LLM doesn't use MCP tools #55874
Originally created by @KrFeher on GitHub (May 25, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/14313
Check Existing Issues
Installation Method
Docker
Open WebUI Version
image version of 1.93.0
Ollama Version (if applicable)
No response
Operating System
Ubuntu 24.04
Browser (if applicable)
No response
Confirmation
Expected Behavior
The Open WebUI LLM uses the tools to answer the question.
Actual Behavior
The LLM doesn't use the tools and refuses to acknowledge that it has any.
Steps to Reproduce
See the screenshot first. The LLM has access to two tools, made available via mcpo, but the mcpo logs don't show any access happening.
This only happens when I pass in multiple tools via a config file; if I don't pass in a config file, it works and the tool is picked up.
Config file:
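(The original config file was attached as an image and is not reproduced here. A hypothetical sketch of what an mcpo config with two SSE-based n8n servers might look like is shown below; the server names match the issue, but the n8n URLs and paths are assumptions, not the reporter's actual values.)

```json
{
  "mcpServers": {
    "n8n_get_bin_dates": {
      "type": "sse",
      "url": "http://n8n:5678/mcp/get-bin-dates/sse"
    },
    "n8n_send_telegram": {
      "type": "sse",
      "url": "http://n8n:5678/mcp/send-telegram/sse"
    }
  }
}
```

With a config like this, mcpo serves each entry under its own subpath (e.g. `/n8n_get_bin_dates`) rather than at the root.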
Both of these tools are hosted on the same server (different docker container).
From my understanding:
✅ n8n tool server works (I can also use it with another non-openwebui client)
✅ mcpo works, as both tools are picked up by openwebui
❌ something about how we instruct models to discover tools doesn't work when there's more than one tool.
Logs & Screenshots
Image shows the LLM having access to 2x tools:
n8n_get_bin_dates and n8n_send_telegram
Additional Information
No response
@bryzgaloff commented on GitHub (May 25, 2025):
I encountered a similar problem. Here is how to solve it.
Please check your http://localhost:3000/admin/settings > Tools. It seems you have set the mcpo root as the tool server URL, such as http://mcpo:8000/. The tool name 'MCP OpenAPI Proxy' indicates this. Instead, you should specify the two subservers separately:
http://mcpo:8000/n8n_send_telegram and http://mcpo:8000/n8n_bin_dates. This way, the tools modal window will display the actual names of the tools. Refer to the Good and Bad examples in the section below: https://docs.openwebui.com/openapi-servers/open-webui#-optional-using-a-config-file-with-mcpo
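The pattern described above can be sketched as follows: mcpo mounts each server from the config file under its own subpath, so each one must be registered in Open WebUI as a separate tool server URL (a sketch; the base URL and tool names are taken from this thread, not verified against the reporter's setup):

```python
# Sketch: mcpo mounts each configured MCP server under its own subpath.
# Registering only the root URL gives Open WebUI a single generic
# "MCP OpenAPI Proxy" entry instead of the individual tools.
base_url = "http://mcpo:8000"
tool_names = ["n8n_send_telegram", "n8n_get_bin_dates"]  # keys from the mcpo config

# URLs to register in Admin Settings > Tools, one entry per tool server:
tool_server_urls = [f"{base_url}/{name}" for name in tool_names]
for url in tool_server_urls:
    print(url)
```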
@KrFeher commented on GitHub (May 25, 2025):
Thanks a lot, I literally found the issue this very second!
Here is how it should look:
@bryzgaloff commented on GitHub (May 25, 2025):
I actually came to this issue since my model (qwen3:8B) does not recognize the tools. Even though I believe the tools are configured properly.
@KrFeher, could you please confirm if your model can use the tools after the fix? I’m unsure if the model’s small size is the cause or if I need to investigate the tool configuration further.
@KrFeher commented on GitHub (May 25, 2025):
I did find that tool use is inconsistent across models. The fact that your model is so small might be a problem as well. Pick another model and test a few, even if it's a remote one.
And to answer your question: yes, the LLMs I used in Open WebUI were capable of using all the tools after the fix, although I only tried gpt-4.1 and Sonnet 3.7. My home lab is a potato and can't run Ollama.