issue: Output stops after first tool call with reasoning enabled (new regression in 0.6.34) #6712

Closed
opened 2025-11-11 17:04:00 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @Bickio on GitHub (Oct 20, 2025).

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

v0.6.34

Ollama Version (if applicable)

No response

Operating System

MacOS 15.4.1

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

Model does its initial thinking, uses the tool natively and then continues its response

Actual Behavior

Model output stops after the first tool use

Steps to Reproduce

  • Configure a reasoning + native function-calling model in litellm (I tested with Claude 4.5 Sonnet)

    • Use merge_reasoning_content_in_choices: true, but no other special settings
  • Configure litellm as an OpenAI-compatible connection in Open WebUI

  • Adjust the model settings in Open WebUI (either in the admin panel or chat settings):

    • Set Function Calling to Native
    • Set Reasoning Effort to medium
  • Configure an "external tool" connection in Open WebUI

  • Prompt the model, with the tool enabled, in a way that causes it to use the tool
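
The litellm side of the steps above can be sketched as a minimal proxy config. Only the merge_reasoning_content_in_choices flag comes from this report; the model name, model string, and API-key placement are illustrative assumptions and may differ from the reporter's setup:

```yaml
# Hypothetical minimal litellm proxy config reproducing the reported setup.
model_list:
  - model_name: claude-4.5-sonnet            # display name assumed
    litellm_params:
      model: anthropic/claude-sonnet-4-5     # exact model string may differ
      api_key: os.environ/ANTHROPIC_API_KEY  # key location assumed

litellm_settings:
  # The only non-default setting used in the report: merges the model's
  # reasoning output into the choices stream instead of a separate field.
  merge_reasoning_content_in_choices: true
```

With this config, litellm is then added to Open WebUI as an OpenAI-compatible connection pointing at the proxy's base URL.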

Logs & Screenshots

SCREENSHOT FROM v0.6.34 (NOT WORKING):

https://github.com/user-attachments/assets/4b96887d-a061-46e1-b3e1-3fe944f0f060

SCREENSHOT FROM v0.6.33 (WORKING):

https://github.com/user-attachments/assets/a9d8be49-7cbe-42dc-a190-cdaf2bac6171

Additional Information

I've tested upgrading and downgrading between the two versions multiple times, and rerunning the prompts multiple times. The failure is 100% reproducible and is clearly a regression introduced between v0.6.33 and v0.6.34.

GiteaMirror added the bug label 2025-11-11 17:04:00 -06:00
Author
Owner

@tjbck commented on GitHub (Oct 20, 2025):

Backend logs required here, most likely has to do with the Pipe Function implementation.
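
For context on where such a regression can hide: in the OpenAI streaming format that litellm emits, a native tool call ends the first stream with finish_reason "tool_calls", and the client is expected to execute the tool and issue a follow-up request to get the continuation. The chunks below are invented for illustration (not captured from the failing run); they only show the delta shape, including the reasoning_content field that merge_reasoning_content_in_choices adds:

```python
# Illustrative OpenAI-style streaming chunks: reasoning delta, then a tool
# call, then a terminal chunk. Field names follow the chat-completions
# chunk format; the values are made up for this sketch.
chunks = [
    {"choices": [{"delta": {"reasoning_content": "Thinking..."},
                  "finish_reason": None}]},
    {"choices": [{"delta": {"tool_calls": [{
                      "index": 0, "id": "call_1",
                      "function": {"name": "get_weather",
                                   "arguments": '{"city": "Oslo"}'}}]},
                  "finish_reason": None}]},
    {"choices": [{"delta": {}, "finish_reason": "tool_calls"}]},
]

def finish_reason(chunks):
    """Return the first non-null finish_reason in the stream.

    'tool_calls' means the client must run the tool and send a follow-up
    request with the tool result; if the client instead treats it as
    end-of-message, output stops right after the first tool use -- the
    symptom described in this issue.
    """
    for c in chunks:
        fr = c["choices"][0]["finish_reason"]
        if fr is not None:
            return fr
    return None

print(finish_reason(chunks))  # -> tool_calls
```

Whether the v0.6.34 regression is in this client-side loop or in the Pipe Function path, as suggested above, would need the requested backend logs to confirm.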


Reference: github-starred/open-webui#6712