[GH-ISSUE #13903] issue: can't make streaming of the output work when using native tool calling #17068

Closed
opened 2026-04-19 22:50:06 -05:00 by GiteaMirror · 1 comment

Originally created by @basirsedighi on GitHub (May 15, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/13903

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Git Clone

Open WebUI Version

v0.6.9

Ollama Version (if applicable)

v0.6.8

Operating System

ubuntu

Browser (if applicable)

edge

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

I would expect that when using native tool calling, the stream of tokens appears in the UI as the tokens are generated.

Actual Behavior

I have set up a Postgres database that I can talk to using MCP and the mcpo proxy repo.
I have found it works very well. However, I cannot get streaming to work.

When I submit a prompt, a call is made to the tool to retrieve data from the database. After retrieval, the LLM's (qwen2.5 / mistral-mini) token stream does not seem to work; instead, the output arrives all at once.
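For context, here is a minimal sketch (not Open WebUI's actual code) of how a client consumes an OpenAI-compatible SSE token stream. The difference between healthy streaming and the behavior reported here is whether the content arrives as many small deltas or as one buffered delta; the simulated chunks below are illustrative only:

```python
import json

def collect_stream(sse_lines):
    """Collect content deltas from OpenAI-style SSE 'data: ...' lines."""
    tokens = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload.strip() == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if delta.get("content"):
            tokens.append(delta["content"])
    return tokens

# Healthy streaming: the answer arrives as several small deltas.
healthy = [
    "data: " + json.dumps({"choices": [{"delta": {"content": t}}]})
    for t in ["Hel", "lo ", "world"]
] + ["data: [DONE]"]

# The behavior reported here: the same text arrives as one buffered delta
# after the tool call completes, so the UI paints it all at once.
buffered = [
    "data: " + json.dumps({"choices": [{"delta": {"content": "Hello world"}}]}),
    "data: [DONE]",
]

print(len(collect_stream(healthy)), len(collect_stream(buffered)))  # 3 vs 1 chunks
```

Both streams produce the same final text, which is why the bug only shows up as a lack of incremental rendering, not as wrong output.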

Steps to Reproduce

  1. Set up an MCP tool that can be called.
  2. Set up an mcpo server so the tool can be called natively from Open WebUI.
  3. Submit a prompt that triggers the tool and observe the response.
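The setup in steps 1–2 can be sketched roughly as follows; the Postgres server package, connection string, and API key are illustrative placeholders, not taken from this report:

```shell
# Expose an MCP server as an OpenAPI-compatible HTTP endpoint via mcpo.
# Everything after "--" is the MCP server command being proxied; the
# Postgres MCP server and connection string below are placeholders.
uvx mcpo --port 8000 --api-key "top-secret" -- \
  npx -y @modelcontextprotocol/server-postgres \
  "postgresql://user:password@localhost:5432/mydb"

# Then register http://localhost:8000 as a tool server in Open WebUI
# and switch the model's function-calling mode to "Native".
```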

Logs & Screenshots

![Image](https://github.com/user-attachments/assets/9f289fce-6ed9-4b64-aa3d-a14ea98f3bf1)

Additional Information

I have tested this on a Mac (installed with uvx) and also on a Kubernetes cluster.

GiteaMirror added the bug label 2026-04-19 22:50:06 -05:00

@tjbck commented on GitHub (May 15, 2025):

Ollama issue.


Reference: github-starred/open-webui#17068