[GH-ISSUE #6114] llama3-groq-tool-use can't request 2 tools at once but llama3.1 could do it #29580

Open
opened 2026-04-22 08:34:29 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @Hor1zonZzz on GitHub (Aug 1, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6114

What is the issue?

**My code is the following:**

```python
from langchain_community.chat_models import ChatOllama
from langchain_core.messages import HumanMessage

model = ChatOllama(model="llama3.1")

def add(a: int, b: int) -> int:
    """Add two integers.

    Args:
        a: First integer
        b: Second integer
    """
    return a + b

def multiply(a: int, b: int) -> int:
    """Multiply two integers.

    Args:
        a: First integer
        b: Second integer
    """
    return a * b

llm_with_tools = model.bind_tools([add, multiply])
messages = [
    HumanMessage("What is 3 * 12? Also, what is 11 + 49?"),
]
ai_msg = llm_with_tools.invoke(messages)
print(ai_msg.tool_calls)
```

**The output is:**

```
[{'name': 'multiply', 'args': {'a': 3, 'b': 12}, 'id': '75528960-e9fe-4166-b531-46992d58cbb3', 'type': 'tool_call'}, {'name': 'add', 'args': {'a': 11, 'b': 49}, 'id': '03576d56-4118-49d6-b410-d02f88447dd7', 'type': 'tool_call'}]
```

**When I change the model name to "llama3-groq-tool-use", the output is:**

```
[{'name': 'multiply', 'args': {'a': 5, 'b': 5}, 'id': '6d9ced85-fdcc-441b-9b53-de609086d468', 'type': 'tool_call'}]
```

Only one tool has been used.
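For reference, here is a minimal, model-free sketch of what the expected two-call flow looks like once `ai_msg.tool_calls` comes back. The `TOOLS` mapping and `run_tool_calls` helper are hypothetical names for illustration, not part of the LangChain API; the example list mirrors the llama3.1 output shown above.

```python
def add(a: int, b: int) -> int:
    """Add two integers (same tool as bound above)."""
    return a + b

def multiply(a: int, b: int) -> int:
    """Multiply two integers (same tool as bound above)."""
    return a * b

# Hypothetical dispatch table mapping tool names to the bound functions.
TOOLS = {"add": add, "multiply": multiply}

def run_tool_calls(tool_calls):
    """Execute every tool call the model requested and collect results."""
    results = []
    for call in tool_calls:
        fn = TOOLS[call["name"]]
        results.append(fn(**call["args"]))
    return results

# With llama3.1's two-entry response, both answers come back;
# llama3-groq-tool-use's single-entry response would yield only one.
example_calls = [
    {"name": "multiply", "args": {"a": 3, "b": 12}},
    {"name": "add", "args": {"a": 11, "b": 49}},
]
print(run_tool_calls(example_calls))  # [36, 60]
```

This makes the reported difference concrete: the dispatch loop works fine either way, but with `llama3-groq-tool-use` the `tool_calls` list itself contains only one entry, so the second question never gets answered.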

OS

No response

GPU

No response

CPU

No response

Ollama version

0.3.0

GiteaMirror added the bug label 2026-04-22 08:34:29 -05:00