[GH-ISSUE #14313] issue: LLM doesn't use MCP tools #55874

Closed
opened 2026-05-05 18:11:55 -05:00 by GiteaMirror · 4 comments

Originally created by @KrFeher on GitHub (May 25, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/14313

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

image version of 1.93.0

Ollama Version (if applicable)

No response

Operating System

Ubuntu 24.04

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

The LLM in Open WebUI uses the tool to answer the question.

Actual Behavior

The LLM doesn't use the tool and refuses to acknowledge that it has access to tools.

Steps to Reproduce

See the screenshot first. The LLM has access to two tools.
They are available via mcpo. The mcpo logs don't show any access happening.
This only happens when I pass in multiple tools via a config file; if I don't pass in a config file, it actually works and picks up the tool.

Config file:

{
  "mcpServers": {
    "n8n_send_telegram": {
      "type": "sse",
      "url": "http://n8n:5678/mcp/send_telegram/sse"
    },
    "n8n_bin_dates": {
      "type": "sse",
      "url": "http://n8n:5678/mcp/get_bin_dates/sse"
    }
  }
}

Both of these tools are hosted on the same server (different docker container).
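For reference, mcpo mounts each named entry under "mcpServers" at its own subpath of the proxy. A minimal sketch (the http://mcpo:8000 base is assumed from the thread, not from the config itself) of how the config entries map to per-tool routes:

```python
import json

# The config from above: each key under "mcpServers" becomes one subserver.
config = json.loads("""
{
  "mcpServers": {
    "n8n_send_telegram": {"type": "sse", "url": "http://n8n:5678/mcp/send_telegram/sse"},
    "n8n_bin_dates": {"type": "sse", "url": "http://n8n:5678/mcp/get_bin_dates/sse"}
  }
}
""")

MCPO_BASE = "http://mcpo:8000"  # assumed host/port of the mcpo container

# Derive the per-tool URLs mcpo exposes (one OpenAPI subserver per entry).
tool_urls = {name: f"{MCPO_BASE}/{name}" for name in config["mcpServers"]}

for name, url in tool_urls.items():
    print(f"{name}: {url}")
```

These per-tool URLs, not the bare base URL, are what matter when wiring the tools up on the Open WebUI side, as the first comment below explains.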

From my understanding:
✅ the n8n tool server works (I can also use it with another, non-Open WebUI client)
✅ mcpo works, as both tools are picked up by Open WebUI
❌ something about how we instruct models to discover tools doesn't work when there's more than one tool.

Logs & Screenshots

Image: https://github.com/user-attachments/assets/e8b4beed-2448-4bd6-8d72-e0c10c586817
The image shows the LLM having access to two tools:
n8n_get_bin_dates and n8n_send_telegram

Image: https://github.com/user-attachments/assets/fa9e9a16-c8a1-43bb-9b85-1cef9ce9a5ab

Additional Information

No response

GiteaMirror added the bug label 2026-05-05 18:11:55 -05:00

@bryzgaloff commented on GitHub (May 25, 2025):

I encountered a similar problem. Here is how to solve it.

Please check your http://localhost:3000/admin/settings > Tools. It seems you have set the mcpo root as the tool server URL, such as http://mcpo:8000/. The tool name 'MCP OpenAPI Proxy' indicates this.

Instead, you should specify two subservers separately: http://mcpo:8000/n8n_send_telegram and http://mcpo:8000/n8n_bin_dates. This way, the tools modal window will display the actual names of the tools.

Refer to the Good and Bad examples in the section below: https://docs.openwebui.com/openapi-servers/open-webui#-optional-using-a-config-file-with-mcpo

This means:

  • When connecting a tool in Open WebUI, you must enter the full route to that specific tool — do NOT enter just the root URL (http://localhost:8000/).
  • Add each tool individually in Open WebUI Settings using their respective subpath URLs.
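The distinction above can be sketched as a quick sanity check. This is an illustrative helper, not part of Open WebUI or mcpo: it merely flags a tool-server URL whose path is empty, which is probably the mcpo root rather than an individual tool subserver.

```python
from urllib.parse import urlparse

def looks_like_mcpo_root(url: str) -> bool:
    """Heuristic: an empty or '/' path suggests the mcpo root URL,
    not a per-tool subserver URL."""
    path = urlparse(url).path
    return path in ("", "/")

# Bad: root URL -- Open WebUI only sees the generic 'MCP OpenAPI Proxy'.
assert looks_like_mcpo_root("http://mcpo:8000/")

# Good: one URL per tool subserver.
for url in ("http://mcpo:8000/n8n_send_telegram", "http://mcpo:8000/n8n_bin_dates"):
    assert not looks_like_mcpo_root(url)

print("tool server URLs look fine")
```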

@KrFeher commented on GitHub (May 25, 2025):

Thanks a lot, I literally found the issue this very second!
Here's how it should look:

Image: https://github.com/user-attachments/assets/f6f7133f-6275-4784-a0ce-031d83de6066


@bryzgaloff commented on GitHub (May 25, 2025):

I actually came to this issue since my model (qwen3:8B) does not recognize the tools. Even though I believe the tools are configured properly.

@KrFeher, could you please confirm if your model can use the tools after the fix? I’m unsure if the model’s small size is the cause or if I need to investigate the tool configuration further.


@KrFeher commented on GitHub (May 25, 2025):

> I actually came to this issue since my model (qwen3:8B) does not recognize the tools. Even though I believe the tools are configured properly.
>
> @KrFeher, could you please confirm if your model can use the tools after the fix? I’m unsure if the model’s small size is the cause or if I need to investigate the tool configuration further.

I did find that tool use is inconsistent across models. The model's small size might be a problem as well; pick another model and test a few, even if it's remote.
And to answer your question: yes, the LLMs I used in Open WebUI were capable of using all tools after the fix, although I only tried GPT-4.1 and Sonnet 3.7. My home lab is a potato and can't run Ollama.


Reference: github-starred/open-webui#55874