[GH-ISSUE #5793] ollama 0.2.7 function call error "llama3 does not support tools" #50121

Closed
opened 2026-04-28 14:11:34 -05:00 by GiteaMirror · 10 comments
Owner

Originally created by @liseri on GitHub (Jul 19, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5793

What is the issue?

ollama 0.2.7 function call error "llama3 does not support tools"

  1. install ollama 0.2.7 in docker
  2. ollama pull llama3
  3. curl

```
curl --location --request POST 'http://localhost:11434/v1/chat/completions' \
--header 'Content-Type: application/json' \
--data-raw '{
    "messages": [
        {
            "content": "what'\''s the weather of beijing",
            "role": "user"
        }
    ],
    "model": "llama3",
    "n": 1,
    "temperature": 0.9,
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "get weather of location or city",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "query": {
                            "location": "location or city",
                            "type": "string"
                        }
                    },
                    "required": [
                        "query"
                    ]
                }
            }
        }
    ]
}'
```

  4. error

```
{
    "error": {
        "message": "llama3 does not support tools",
        "type": "api_error",
        "param": null,
        "code": null
    }
}
```
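As an aside, the parameter schema in the request above puts the description under a non-standard `"location"` key, where JSON Schema conventionally uses `"description"` (this is unrelated to the error, which is about model capability). A minimal Python sketch of the same payload with the conventional key — the helper name is mine, not part of any API:

```python
import json

def build_weather_tools():
    """Rebuild the tools array from the report, with the parameter
    description under the standard JSON Schema "description" key
    (the original request used a non-standard "location" key)."""
    return [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "get weather of location or city",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "query": {
                            "type": "string",
                            "description": "location or city",
                        }
                    },
                    "required": ["query"],
                },
            },
        }
    ]

# The full request body, ready to serialize and POST to
# /v1/chat/completions once a tool-capable model is used.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "what's the weather of beijing"}],
    "n": 1,
    "temperature": 0.9,
    "tools": build_weather_tools(),
}
body = json.dumps(payload)
```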

OS

Docker

GPU

Nvidia

CPU

Intel

Ollama version

0.2.7

GiteaMirror added the bug label 2026-04-28 14:11:34 -05:00

@rick-github commented on GitHub (Jul 19, 2024):

The model itself doesn't support tool calls. You can work around that by specifying a system prompt that provides instruction and tools that deal with the user request:

```sh
tools='
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "get weather of location or city",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "query": {
                            "location": "location or city",
                            "type": "string"
                        }
                    },
                    "required": [
                        "query"
                    ]
                }
            }
        }
    ]
'

system='You are a helpful assistant that takes a question and finds the most appropriate tool or tools to execute, along with the parameters required to run the tool.  Respond as JSON using the following schema: {"functionName": "function name", "parameters": [{"parameterName", "name of parameter", "parameterValue": "value of parameter"}]}.  The tools are: '
content=$(echo "$system" "$tools" | jq -sR)

curl -s --location --request POST 'http://localhost:11434/v1/chat/completions' \
--header 'Content-Type: application/json' \
--data-raw '{
    "messages": [
        {
            "role": "system",
            "content": '"$content"'
        },
        {
            "content": "what'\''s the weather of beijing",
            "role": "user"
        }
    ],
    "model": "llama3",
    "n": 1,
    "temperature": 0.9,
    "format": "json"
}'
```
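The same prompt-based workaround can be sketched in Python — the helper names below are mine, not part of the ollama API; they build the suggested system prompt and parse a reply that follows the suggested `{"functionName": ..., "parameters": [...]}` schema:

```python
import json

# Mirrors the shell workaround: embed the tool definitions in a system
# prompt, then decode the model's JSON reply using the proposed schema.
SYSTEM_TEMPLATE = (
    "You are a helpful assistant that takes a question and finds the most "
    "appropriate tool or tools to execute, along with the parameters required "
    "to run the tool. Respond as JSON using the following schema: "
    '{"functionName": "function name", "parameters": [{"parameterName": '
    '"name of parameter", "parameterValue": "value of parameter"}]}. '
    "The tools are: "
)

def build_system_prompt(tools: list) -> str:
    """Append the serialized tool definitions to the instruction prompt."""
    return SYSTEM_TEMPLATE + json.dumps({"tools": tools})

def parse_tool_reply(reply: str) -> tuple[str, dict]:
    """Turn the model's JSON reply into (function_name, {param: value})."""
    data = json.loads(reply)
    params = {p["parameterName"]: p["parameterValue"] for p in data["parameters"]}
    return data["functionName"], params

# Example reply a model might produce under this prompt (illustrative only):
reply = ('{"functionName": "get_weather", "parameters": '
         '[{"parameterName": "query", "parameterValue": "beijing"}]}')
name, params = parse_tool_reply(reply)
```

Because the model is merely instructed (not trained) to emit this schema, production code should treat `parse_tool_reply` as fallible and handle `json.JSONDecodeError` and missing keys.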

@vertrue commented on GitHub (Jul 19, 2024):

By the way, are there any llama3 8b/70b models that can work with tools?

I was trying MaziyarPanahi/Meta-Llama-3-70B-Instruct-GGUF from HF and tools just don't work.
The default ollama llama3:70b doesn't support tools either.

Although groq is using meta-llama/Meta-Llama-3-70B-Instruct and it supports function calling.

Is it possible to indicate which specific models support tools and which do not? It is not clear to me.


@rick-github commented on GitHub (Jul 19, 2024):

https://ollama.com/library/llama3-groq-tool-use supports tool use. I just filed #5794 to expose the capabilities of a model.


@vertrue commented on GitHub (Jul 19, 2024):

Omg, thanks!


@liseri commented on GitHub (Jul 22, 2024):

@rick-github thanks;


@jmorganca commented on GitHub (Jul 22, 2024):

Indeed! llama3 doesn't support tool calling since it wasn't trained for it; however, as @rick-github mentioned, https://ollama.com/library/llama3-groq-tool-use is a great one to try that does. @rick-github thanks for the issue as well, that's a great idea!


@liseri commented on GitHub (Jul 26, 2024):

llama3.1 supports tool calls!!! Great news, spread the word!

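With a tool-capable model such as llama3.1, the `/v1/chat/completions` response carries tool calls in the standard OpenAI shape (`choices[0].message.tool_calls`, with arguments as a JSON string). A minimal sketch of extracting them from an already-decoded response — the sample response below is illustrative, not captured from a real run:

```python
import json

def extract_tool_calls(response: dict) -> list[tuple[str, dict]]:
    """Pull (function_name, arguments) pairs out of an OpenAI-style
    chat-completion response dict; arguments arrive as a JSON string."""
    message = response["choices"][0]["message"]
    calls = []
    for call in message.get("tool_calls") or []:
        fn = call["function"]
        calls.append((fn["name"], json.loads(fn["arguments"])))
    return calls

# Illustrative response in the OpenAI-compatible shape:
response = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": None,
            "tool_calls": [{
                "id": "call_0",
                "type": "function",
                "function": {"name": "get_weather",
                             "arguments": '{"query": "beijing"}'},
            }],
        }
    }]
}
calls = extract_tool_calls(response)
```

When the model answers in plain text instead of calling a tool, `tool_calls` is absent and the helper returns an empty list, so callers can fall back to `message["content"]`.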

@masterwishx commented on GitHub (Oct 7, 2024):

Sorry, can someone explain?

I'm trying to work with ollama and Claude Dev in VS Code:

![image](https://github.com/user-attachments/assets/dc08c1cb-d929-4552-8f2c-53bfef729957)


@stiliajohny commented on GitHub (Feb 23, 2025):

Is there a model that supports tools, comparable to llama?


@rick-github commented on GitHub (Feb 23, 2025):

https://ollama.com/search?c=tools

Reference: github-starred/ollama#50121