[GH-ISSUE #11417] Consistent Tool Call Ids #33296

Open
opened 2026-04-22 15:50:16 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @hbergmey on GitHub (Jul 14, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11417

The Ollama chat API allows a model to return multiple tool calls in response to a single request. This may include several calls to the same tool, just with different parameter values. In that case, in order to clearly associate each tool result with the tool call that produced it, it is necessary to specify a tool call id in the tool message. Unfortunately, the chat API only allows specifying the tool name, which is ambiguous.

Take the weather tool from the documentation as an example. You can specify a location and a temperature format. The user asks for the weather in London and Brussels, so the model answers with two tool calls. In the OpenAI API, each call carries a unique tool call id. You can use these ids when sending the history back with two tool messages appended, both clearly identified by their tool call id, so that it is clear which temperature belongs to which call.
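To make the OpenAI-side flow concrete, here is a minimal sketch of such a chat history (the ids, tool name, and argument values are invented for illustration; the message shapes follow the OpenAI chat format):

```python
# Two calls to the same tool in one assistant turn; each tool result is
# tied back to its call via a unique tool_call_id.
history = [
    {"role": "user", "content": "What is the weather in London and Brussels?"},
    {
        "role": "assistant",
        "tool_calls": [
            {"id": "call_abc123", "type": "function",
             "function": {"name": "get_weather",
                          "arguments": '{"location": "London", "format": "celsius"}'}},
            {"id": "call_def456", "type": "function",
             "function": {"name": "get_weather",
                          "arguments": '{"location": "Brussels", "format": "celsius"}'}},
        ],
    },
    # Each tool message names the id of the call it answers, so the model
    # can tell which temperature belongs to which city.
    {"role": "tool", "tool_call_id": "call_abc123", "content": "15 degrees"},
    {"role": "tool", "tool_call_id": "call_def456", "content": "18 degrees"},
]

# Every tool result resolves to exactly one tool call.
call_ids = {c["id"] for m in history if m.get("tool_calls") for c in m["tool_calls"]}
result_ids = {m["tool_call_id"] for m in history if m["role"] == "tool"}
assert result_ids == call_ids
```

Even though both calls target the same function, the ids keep the two result messages unambiguous.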

The Ollama chat API provides no such identification of tool results.
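By contrast, a sketch of the same exchange in the Ollama-native format (field names as described in this issue; only the tool name is available on tool messages) shows where the ambiguity arises:

```python
# Hedged sketch: both tool results name the same tool, so nothing ties
# "15 degrees" to London rather than Brussels.
ollama_history = [
    {"role": "user", "content": "What is the weather in London and Brussels?"},
    {"role": "assistant", "tool_calls": [
        {"function": {"name": "get_weather",
                      "arguments": {"location": "London", "format": "celsius"}}},
        {"function": {"name": "get_weather",
                      "arguments": {"location": "Brussels", "format": "celsius"}}},
    ]},
    {"role": "tool", "tool_name": "get_weather", "content": "15 degrees"},
    {"role": "tool", "tool_name": "get_weather", "content": "18 degrees"},
]

names = [m["tool_name"] for m in ollama_history if m["role"] == "tool"]
# Two results, one distinct name: the association is ambiguous.
assert len(names) == 2 and len(set(names)) == 1
```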

This is currently the main reason for me to choose the OpenAI API over the Ollama API, even though it has other drawbacks, such as not being able to set num_ctx or other model parameters per request.

GiteaMirror added the feature request label 2026-04-22 15:50:16 -05:00
Author
Owner

@drewd789 commented on GitHub (Mar 27, 2026):

Agreed that this should be implemented, but I have a question: would using `'tool_call_id': tool_call.function.name + tool_call.function.arguments` work as well?
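One way to explore that suggestion is a hypothetical helper that derives an id from the function name plus its serialized arguments. The sketch below (all names are invented) shows both what it buys and where it falls short:

```python
import json

def synthetic_tool_call_id(tool_call: dict) -> str:
    """Hypothetical helper: derive an id by concatenating the function
    name and its serialized arguments, as the comment proposes."""
    fn = tool_call["function"]
    return fn["name"] + json.dumps(fn["arguments"], sort_keys=True)

# The derived id does distinguish calls with different arguments...
a = synthetic_tool_call_id({"function": {"name": "get_weather",
                                         "arguments": {"location": "London"}}})
b = synthetic_tool_call_id({"function": {"name": "get_weather",
                                         "arguments": {"location": "Brussels"}}})
assert a != b

# ...but it collides when the model issues the exact same call twice,
# which a server-generated unique id would not.
c = synthetic_tool_call_id({"function": {"name": "get_weather",
                                         "arguments": {"location": "London"}}})
assert a == c
```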


Reference: github-starred/ollama#33296