[GH-ISSUE #5967] "llama3.1:70b does not support tools" #65766

Closed
opened 2026-05-03 22:36:22 -05:00 by GiteaMirror · 4 comments

Originally created by @SinanAkkoyun on GitHub (Jul 26, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5967

What is the issue?

```
Traceback (most recent call last):
  File "/home/ai/ml/llm/inference/ollama/function_calling.py", line 3, in <module>
    response = ollama.chat(
               ^^^^^^^^^^^^
  File "/home/ai/.mconda3/envs/ollama/lib/python3.11/site-packages/ollama/_client.py", line 235, in chat
    return self._request_stream(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/ai/.mconda3/envs/ollama/lib/python3.11/site-packages/ollama/_client.py", line 98, in _request_stream
    return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
                                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ai/.mconda3/envs/ollama/lib/python3.11/site-packages/ollama/_client.py", line 74, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: llama3.1:70b does not support tools
```
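
The call in function_calling.py is not shown in the report, but the traceback implies a chat request that passes a `tools` list. A minimal sketch of what presumably triggers it (the `get_weather` tool schema here is a hypothetical placeholder):

```python
import ollama

# Hypothetical reconstruction: any chat request that passes a `tools`
# list hits the server-side template check that raises the error above.
response = ollama.chat(
    model='llama3.1:70b',
    messages=[{'role': 'user', 'content': 'What is the weather in Berlin?'}],
    tools=[{
        'type': 'function',
        'function': {
            'name': 'get_weather',
            'description': 'Get the current weather for a city',
            'parameters': {
                'type': 'object',
                'properties': {'city': {'type': 'string'}},
                'required': ['city'],
            },
        },
    }],
)
print(response['message'])
```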

Suggested solution: add tool support for all of the model's sub-tags (e.g. llama3.1:70b), not just the default tag.

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.3.0

GiteaMirror added the bug label 2026-05-03 22:36:22 -05:00

@rick-github commented on GitHub (Jul 26, 2024):

The initial release of the llama3.1 models didn't have a template that supported tools; if you re-pull the model you should get the update.
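
For reference, re-pulling only downloads the layers that changed (here, the template), so it is much faster than the initial download. From the Python client it would look like this, equivalent to `ollama pull llama3.1:70b` on the CLI:

```python
import ollama

# Re-pull to fetch the updated manifest and chat template; unchanged
# model layers are cached, so this is much faster than the first pull.
ollama.pull('llama3.1:70b')
```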

@StarPet commented on GitHub (Jul 26, 2024):

Is there a property of the model (e.g. via the /show API) that indicates whether the model supports tools?

@rick-github commented on GitHub (Jul 26, 2024):

Not yet; there is an open ticket: https://github.com/ollama/ollama/issues/5794
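
Until that ticket lands, tool support can be probed heuristically. A sketch against the 0.3-era Python client, where `show()` returns a plain dict; the `.Tools` template check and the error-string match are heuristics inferred from this thread, not an official API:

```python
import ollama
from ollama import ResponseError

def supports_tools(model: str) -> bool:
    """Heuristic check for tool support, pending an official capability flag."""
    # Cheap check first: tool-capable Ollama chat templates reference .Tools.
    if '.Tools' in ollama.show(model).get('template', ''):
        return True
    # Fallback probe: the server rejects a tools request for unsupported
    # models with the ResponseError seen above. Note this runs a real
    # (small) inference request if the model does support tools.
    try:
        ollama.chat(
            model=model,
            messages=[{'role': 'user', 'content': 'ping'}],
            tools=[{
                'type': 'function',
                'function': {
                    'name': 'noop',
                    'description': 'placeholder tool for the probe',
                    'parameters': {'type': 'object', 'properties': {}},
                },
            }],
        )
        return True
    except ResponseError as e:
        if 'does not support tools' in str(e):
            return False
        raise

print(supports_tools('llama3.1:70b'))
```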

@SinanAkkoyun commented on GitHub (Jul 26, 2024):

@rick-github Thank you so much!

Reference: github-starred/ollama#65766