[GH-ISSUE #9414] The model does nothing with the result of a native tool call #15496
Originally created by @Banbury on GitHub (Feb 5, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/9414
Bug Report
Installation Method
Docker
Environment
Open WebUI Version: 0.5.9
Ollama (if applicable): 0.5.7
Operating System: Windows 11
Browser (if applicable): Firefox 134.0.2
Confirmation:
Expected Behavior:
The result of a native tool call is handed to the model for further processing.
Actual Behavior:
The model does nothing with the result of a native tool call.
Description
Bug Summary:
Native tool calling works in principle, but the model is never called again with the tool's result, so no answer is generated.
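For context, the expected flow can be sketched as a standard OpenAI-style tool-calling loop: when the model's reply contains `tool_calls`, the tools are executed, their results are appended as `role: "tool"` messages, and the model is invoked again. The names below (`run_chat`, `chat_completion`, `TOOLS`) are hypothetical illustrations, not Open WebUI internals; the `dice_roll` signature is taken from the tool spec in the logs.

```python
# Hedged sketch of the expected native tool-calling loop.
# chat_completion() stands in for any backend that returns an
# assistant message dict, possibly containing "tool_calls".
import json
import random


def dice_roll(dice_str: str) -> str:
    """Roll dice described by e.g. 'd100', '2d6', '3d10+2'."""
    count, _, rest = dice_str.partition("d")
    sides, _, mod = rest.partition("+")
    total = sum(random.randint(1, int(sides)) for _ in range(int(count or 1)))
    return str(total + int(mod or 0))


TOOLS = {"dice_roll": dice_roll}


def run_chat(messages, chat_completion):
    """Call the model; if it requests tools, run them and call again."""
    response = chat_completion(messages)
    while response.get("tool_calls"):
        messages.append(response)  # assistant turn carrying the tool_calls
        for call in response["tool_calls"]:
            fn = TOOLS[call["function"]["name"]]
            args = json.loads(call["function"]["arguments"])
            messages.append({
                "role": "tool",
                "tool_call_id": call["id"],
                "content": fn(**args),
            })
        # The step the report says is missing: invoking the model
        # again with the tool result so it can produce a final answer.
        response = chat_completion(messages)
    return response["content"]
```

If this second invocation is skipped, the chat ends right after the tool call, which matches the behavior described above.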
Reproduction Details
Steps to Reproduce:
Logs and Screenshots
Browser Console Logs:
console-export-2025-2-5_16-42-23.txt
Docker Container Logs:
2025-02-05 16:43:56 open-webui | INFO: 192.168.32.1:42608 - "POST /api/v1/chats/ea42d230-9aa4-4bc7-acbb-8ac2fe7b9490 HTTP/1.1" 200 OK
2025-02-05 16:43:56 open-webui | INFO: 192.168.32.1:42608 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
2025-02-05 16:43:56 open-webui | INFO [open_webui.utils.middleware] tools={'dice_roll': {'toolkit_id': 'dice_roller', 'callable': <function Tools.dice_roll at 0x7f18997a87c0>, 'spec': {'name': 'dice_roll', 'description': '\n Rolls a number of dice and returns the result.\n ', 'parameters': {'properties': {'dice_str': {'description': 'A text describing a how many dice should be rolled. E.g. "d100" is one 100-sided die, "2d6" are two six sided dice. A modifier also can be added, e.g. "3d10+2".', 'type': 'string'}}, 'required': ['dice_str'], 'type': 'object'}}, 'pydantic_model': <class 'open_webui.utils.tools.dice_roll'>, 'file_handler': False, 'citation': False}}
2025-02-05 16:43:57 open-webui | INFO: 192.168.32.1:42608 - "POST /api/chat/completions HTTP/1.1" 200 OK
2025-02-05 16:43:57 open-webui |
2025-02-05 16:43:57 open-webui |
2025-02-05 16:43:57 open-webui | {'index': 0, 'id': 'call_2cf4317d-058f-4b18-b592-578355b950fc', 'type': 'function', 'function': {'name': 'dice_roll', 'arguments': "{'dice_str': '2d10'}"}}
2025-02-05 16:43:57 open-webui |
2025-02-05 16:43:57 open-webui |
2025-02-05 16:43:57 open-webui | INFO: 192.168.32.1:42608 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
2025-02-05 16:43:57 open-webui | INFO: 192.168.32.1:42620 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
2025-02-05 16:43:57 open-webui | INFO: 192.168.32.1:42612 - "POST /api/chat/completed HTTP/1.1" 200 OK
2025-02-05 16:43:57 open-webui | INFO: 192.168.32.1:42608 - "POST /api/v1/chats/ea42d230-9aa4-4bc7-acbb-8ac2fe7b9490 HTTP/1.1" 200 OK
2025-02-05 16:43:57 open-webui | INFO: 192.168.32.1:42608 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
@tjbck commented on GitHub (Feb 5, 2025):
It is called; llama3 models are known for being unreliable.