mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-08 12:58:11 -05:00
[PR #6836] [CLOSED] feat: A new parameter model called "native_tool_call" is added. #8755
📋 Pull Request Information
Original PR: https://github.com/open-webui/open-webui/pull/6836
Author: @smonux
Created: 11/10/2024
Status: ❌ Closed
Base: dev ← Head: dev
📝 Commits (1)
6ab6379 feat: A new parameter model called "native_tool_call" is present.
📊 Changes
6 files changed (+346 additions, -39 deletions)
📝 backend/open_webui/apps/ollama/main.py (+2 -0)
📝 backend/open_webui/apps/openai/main.py (+4 -0)
📝 backend/open_webui/main.py (+294 -39)
📝 backend/open_webui/utils/payload.py (+2 -0)
📝 src/lib/components/chat/Chat.svelte (+13 -0)
📝 src/lib/components/chat/Settings/Advanced/AdvancedParams.svelte (+31 -0)
📄 Description
(squashed commit for easier review)
A new parameter model called "native_tool_call" is added.
If false (the default), the behaviour when tools are enabled doesn't change: a prompt is built that asks the model to generate a function call compatible with the offered tools. If the call is valid, its output is added to the context.
If true, the API's native tool-calling mechanism is used instead. That means responses have to be inspected to detect whether a tool_call has to be handled, and acted on accordingly. This allows more complex behaviours where the model can try several strategies and call tools knowing the output of others.
The Ollama tool-calling API only supports non-streaming requests. If both streaming and native_tool_call are enabled, native tool calling is silently ignored.
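The inspect-and-handle loop described above can be sketched as follows. This is a minimal illustration with hypothetical helper names, not the PR's actual implementation (which lives in backend/open_webui/main.py); it assumes OpenAI-style response messages where a tool request appears as a `tool_calls` list on the assistant turn.

```python
import json

def handle_native_tool_calls(response_message, tools):
    """Inspect an OpenAI-style assistant message. If it contains tool_calls,
    run the matching local functions and return the messages to append to the
    chat context. Return None when the model answered directly."""
    tool_calls = response_message.get("tool_calls")
    if not tool_calls:
        return None  # plain answer: nothing to handle

    # Keep the assistant turn that requested the calls, then append one
    # "tool" message per call so the model can see the results next round.
    follow_up = [response_message]
    for call in tool_calls:
        fn_name = call["function"]["name"]
        args = json.loads(call["function"]["arguments"])
        result = tools[fn_name](**args)  # execute the matching local tool
        follow_up.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": str(result),
        })
    return follow_up
```

In the prompt-based approach a single round-trip suffices; with the native mechanism the caller loops, re-sending the context with these appended messages until the model stops requesting tools.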
Testing
The feature has been tested through the UI and with a script that calls these endpoints:
/api/chat/completions
/ollama/api/chat
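A hypothetical sketch of a request body such a test script might send. The exact placement of `native_tool_call` (here under a `params` object) is an assumption; the remaining fields follow the standard OpenAI-style chat schema the endpoints accept.

```python
import json

def build_request(model, prompt, native=True):
    """Build a chat-completion request body that opts in to native tool
    calling via the new parameter (placement under "params" is assumed)."""
    return {
        "model": model,
        "stream": False,  # Ollama native tool calling requires non-streaming
        "params": {"native_tool_call": native},
        "messages": [{"role": "user", "content": prompt}],
    }

payload = json.dumps(build_request("llama3.2:3b", "What's the weather in Madrid?"))
```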
With the OpenAI API, these models have been tested, with varying success:
gpt-4o-mini
qwen/qwen-2.5-72b-instruct (OpenRouter endpoint)
anthropic/claude-3-haiku (OpenRouter endpoint)
deepseek/deepseek-chat (OpenRouter endpoint)
google/gemini-pro (OpenRouter endpoint)
cohere/command-r-plus (OpenRouter endpoint)
Fails:
meta-llama/llama-3.1-70b-instruct (OpenRouter endpoint)
With the Ollama API, only llama3.2:3b has been tested.
These are the prompts used, with two functions provided:
The full output is attached to the PR.
Before submitting, make sure you've checked the following:
The PR targets the dev branch. If the PR gets accepted, I will update the documentation accordingly.
Changelog Entry
Description
A new parameter model called "native_tool_call" is added. If set to true, tool calls are performed through the API instead of the prompt-based approach used until now.
output.txt
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.