Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 10:58:17 -05:00)
[PR #7021] [CLOSED] feat: Added native_tool_call model option which enables tool use through API calls #60856
📋 Pull Request Information
Original PR: https://github.com/open-webui/open-webui/pull/7021
Author: @smonux
Created: 11/18/2024
Status: ❌ Closed
Base: dev ← Head: dev
📝 Commits (10+)
- bfc5f6b feat: Add tool handling and response modification for non-streaming responses
- 2a51781 feat: Complete handle_nonstreaming_response with tool call handling
- 6bcfdaa fix: resolve undefined names 'call_next' and 'response' in handle_nonstreaming_response
- 83191db refactor: remove unused imports in main.py
- c17c9e2 fix: correct variable name in nonstreaming response handler
- eb1366b fix for streaming
- 6777210 incremental fixes nonstreaming
- c19c7b5 fix: Non streaming response is working
- 712f82e fix: now streaming works
- 0a902fc refac
📊 Changes
57 files changed (+571 additions, -53 deletions)
📝 backend/open_webui/apps/ollama/main.py (+2 -0)
📝 backend/open_webui/apps/openai/main.py (+4 -0)
📝 backend/open_webui/main.py (+455 -51)
📝 backend/open_webui/static/assets/pdf-style.css (+2 -2)
📝 backend/open_webui/utils/payload.py (+2 -0)
📝 backend/open_webui/utils/tools.py (+13 -0)
📝 src/lib/components/chat/Chat.svelte (+13 -0)
📝 src/lib/components/chat/Settings/Advanced/AdvancedParams.svelte (+29 -0)
📝 src/lib/components/chat/Settings/General.svelte (+3 -0)
📝 src/lib/i18n/locales/ar-BH/translation.json (+1 -0)
📝 src/lib/i18n/locales/bg-BG/translation.json (+1 -0)
📝 src/lib/i18n/locales/bn-BD/translation.json (+1 -0)
📝 src/lib/i18n/locales/ca-ES/translation.json (+1 -0)
📝 src/lib/i18n/locales/ceb-PH/translation.json (+1 -0)
📝 src/lib/i18n/locales/cs-CZ/translation.json (+1 -0)
📝 src/lib/i18n/locales/da-DK/translation.json (+1 -0)
📝 src/lib/i18n/locales/de-DE/translation.json (+1 -0)
📝 src/lib/i18n/locales/dg-DG/translation.json (+1 -0)
📝 src/lib/i18n/locales/el-GR/translation.json (+1 -0)
📝 src/lib/i18n/locales/en-GB/translation.json (+1 -0)
...and 37 more files
📄 Description
A new model parameter called "native_tool_call" is added.
If false (the default), the behaviour when tools are enabled doesn't change: a prompt is built which asks the model to generate a function call compatible with the offered tools. If the output is valid, it is added to the context.
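The default, prompt-based flow can be sketched roughly as below. This is a simplified illustration, not the PR's actual code: `query_model` and `run_tool` are hypothetical callables, and the prompt wording is invented.

```python
import json

# Hypothetical prompt asking the model to emit a JSON function call.
TOOL_PROMPT = (
    "You have access to these tools:\n{specs}\n"
    'Reply ONLY with JSON: {{"name": "<tool>", "parameters": {{...}}}}'
)

def prompt_based_tool_call(user_message, tools, query_model, run_tool):
    """Default flow (native_tool_call=False): ask via prompt, parse, run."""
    specs = json.dumps([t["spec"] for t in tools.values()])
    reply = query_model(TOOL_PROMPT.format(specs=specs) + "\n\n" + user_message)
    try:
        call = json.loads(reply)               # model must emit valid JSON
        result = run_tool(call["name"], call.get("parameters", {}))
    except (json.JSONDecodeError, KeyError):
        return None                            # invalid output: nothing added
    return f"Tool `{call['name']}` returned: {result}"  # appended to context
```

If the model's reply is not valid JSON, the context is simply left unchanged, which matches the "if it's valid" caveat above.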
If true, the API's native tool-calling mechanism is used instead. That means responses have to be inspected to detect whether a tool_call has to be handled, and acted on accordingly. This allows more complex behaviours where the model can try several strategies and call tools knowing the output of others.
The Ollama tool-calling API only supports non-streaming requests. If both streaming and native_tool_call are enabled, native tool calling is silently ignored.
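The native flow described above can be sketched as follows, assuming an OpenAI-style response shape. `call_api` and `run_tool` are hypothetical callables, not Open WebUI functions, and the loop/round structure is illustrative only.

```python
import json

def handle_native_tool_calls(messages, tools, call_api, run_tool,
                             stream=False, max_rounds=5):
    """Sketch of the native flow (native_tool_call=True)."""
    if stream:
        # Ollama's tool-calling API only supports non-streaming requests,
        # so native tool calling is silently skipped when streaming.
        return call_api(messages, tools=None)
    response = call_api(messages, tools=[t["spec"] for t in tools.values()])
    for _ in range(max_rounds):
        msg = response["choices"][0]["message"]
        calls = msg.get("tool_calls")
        if not calls:
            return response                  # final answer, no tool requested
        messages.append(msg)                 # keep the assistant turn in context
        for call in calls:                   # feed each result back as a "tool" message
            fn = call["function"]
            result = run_tool(fn["name"], json.loads(fn["arguments"]))
            messages.append({"role": "tool",
                             "tool_call_id": call["id"],
                             "content": str(result)})
        response = call_api(messages, tools=[t["spec"] for t in tools.values()])
    return response
```

Because each tool result is appended to `messages` before the next API call, the model can "call tools knowing the output of others", which is the behaviour the prompt-based approach cannot easily reproduce.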
Testing
The feature has been tested through the ui and using a script which calls these endpoints using
/api/chat/completions
/ollama/api/chat
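A test harness along these lines could exercise the two endpoints. This is a sketch under stated assumptions: `BASE_URL`, the placeholder token, and the exact request field carrying `native_tool_call` (here nested under `params`) are guesses about the local setup, not documented Open WebUI API details.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080"   # assumed local Open WebUI instance
TOKEN = "sk-..."                     # placeholder API key

def build_body(model, prompt, native_tool_call=True):
    """Build a non-streaming chat request enabling native tool calls."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # Ollama ignores native tool calls when streaming
        "params": {"native_tool_call": native_tool_call},
    }

def chat(endpoint, model, prompt, native_tool_call=True):
    req = urllib.request.Request(
        BASE_URL + endpoint,
        data=json.dumps(build_body(model, prompt, native_tool_call)).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# e.g. chat("/api/chat/completions", "llama3.2:3b", "What's the weather in Madrid?")
# or   chat("/ollama/api/chat", "llama3.2:3b", "What's the weather in Madrid?")
```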
With the OpenAI API, these models have been tested, with varying success.
Fails:
- meta-llama/llama-3.1-70b-instruct (OpenRouter endpoint)
With the Ollama API, only llama3.2:3b has been tested.
These are the prompts used, with two functions provided:
The full output is attached to the PR: output.txt
The PR targets the dev branch. I will update the docs if it gets accepted.
Changelog Entry
Description
A new model parameter called "native_tool_call" is added. If set to true, tool calls are performed using the API instead of the prompt-based approach, which is the default.
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.