Add support for parallel_tool_calls #5963

Open
opened 2025-11-12 13:17:36 -06:00 by GiteaMirror · 3 comments
Owner

Originally created by @Pign on GitHub (Feb 16, 2025).

Originally assigned to: @ParthSareen on GitHub.

It would be nice to add support for `parallel_tool_calls`, so callers can specify whether the model is allowed to call several tools at once.

Pinging @ParthSareen as he requested.
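For context, a minimal sketch of how the parameter might look in an OpenAI-compatible `/v1/chat/completions` request body. The parameter name follows the OpenAI API; the model and tool names below are hypothetical examples, and Ollama does not accept this field yet (that is this request):

```python
import json

# Hypothetical request body for an OpenAI-compatible chat completions endpoint.
payload = {
    "model": "llama3.1",  # example model name
    "messages": [
        {"role": "user", "content": "What is the weather in Paris and Berlin?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    # False asks the model to emit at most one tool call per turn;
    # true (the OpenAI default) allows several tool calls in one response.
    "parallel_tool_calls": False,
}

print(json.dumps(payload, indent=2))
```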

GiteaMirror added the feature request label 2025-11-12 13:17:36 -06:00

@ParthSareen commented on GitHub (Feb 18, 2025):

Thanks for the ticket @Pign will look into this :)


@masfernandez commented on GitHub (Aug 27, 2025):

Any update on this?


@tcztzy commented on GitHub (Nov 11, 2025):

llama.cpp already supports this parameter.

https://github.com/ggml-org/llama.cpp/pull/15647


Reference: github-starred/ollama-ollama#5963