[GH-ISSUE #11281] Is there a way we can add tool support for a particular type of model? #53952

Closed
opened 2026-04-29 05:00:06 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @qwerty108109 on GitHub (Jul 3, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11281

I was wondering if we can add tool support for this type of model.
https://huggingface.co/unsloth/DeepSeek-R1-0528-Qwen3-8B-GGUF

This is a highly quantized model, with much better performance than I've ever seen from a compressed model.
It is MIT licensed.

Take this quantization in particular:
hf.co/unsloth/DeepSeek-R1-0528-Qwen3-8B-GGUF:Q4_K_M

You can pull it and run it, and it runs a lot faster than regular DeepSeek models, but it does not have tool support.

GiteaMirror added the feature request label 2026-04-29 05:00:06 -05:00

@qwerty108109 commented on GitHub (Jul 3, 2025):

It looks like the chat template (`tokenizer.chat_template`) for this model does support tools.

tokenizer.chat_template:

````
{%- if not add_generation_prompt is defined %} {%- set add_generation_prompt = false %} {%- endif %} {%- set ns = namespace(is_first=false, is_tool=false, is_output_first=true, system_prompt='', is_first_sp=true, is_last_user=false) %} {%- for message in messages %} {%- if message['role'] == 'system' %} {%- if ns.is_first_sp %} {%- set ns.system_prompt = ns.system_prompt + message['content'] %} {%- set ns.is_first_sp = false %} {%- else %} {%- set ns.system_prompt = ns.system_prompt + '\n\n' + message['content'] %} {%- endif %} {%- endif %} {%- endfor %} {#- Adapted from https://github.com/sgl-project/sglang/blob/main/examples/chat_template/tool_chat_template_deepseekr1.jinja #} {%- if tools is defined and tools is not none %} {%- set tool_ns = namespace(text='You are a helpful assistant with tool calling capabilities. ' + 'When a tool call is needed, you MUST use the following format to issue the call:\n' + '<|tool▁calls▁begin|><|tool▁call▁begin|>function<|tool▁sep|>FUNCTION_NAME\n' + '```json\n{"param1": "value1", "param2": "value2"}\n```<|tool▁call▁end|><|tool▁calls▁end|>\n\n' + 'Make sure the JSON is valid.'
+ '## Tools\n\n### Function\n\nYou have the following functions available:\n\n') %} {%- for tool in tools %} {%- set tool_ns.text = tool_ns.text + '\n```json\n' + (tool | tojson) + '\n```\n' %} {%- endfor %} {%- if ns.system_prompt|length != 0 %} {%- set ns.system_prompt = ns.system_prompt + '\n\n' + tool_ns.text %} {%- else %} {%- set ns.system_prompt = tool_ns.text %} {%- endif %} {%- endif %} {{- bos_token }} {{- ns.system_prompt }} {%- set last_index = (messages|length - 1) %} {%- for message in messages %} {%- set content = message['content'] %} {%- if message['role'] == 'user' %} {%- set ns.is_tool = false -%} {%- set ns.is_first = false -%} {%- set ns.is_last_user = true -%} {%- if loop.index0 == last_index %} {{- '<|User|>' + content }} {%- else %} {{- '<|User|>' + content + '<|Assistant|>'}} {%- endif %} {%- endif %} {%- if message['role'] == 'assistant' %} {%- if '</think>' in content %} {%- set content = (content.split('</think>')|last) %} {%- endif %} {%- endif %} {%- if message['role'] == 'assistant' and message['tool_calls'] is defined and message['tool_calls'] is not none %} {%- set ns.is_last_user = false -%} {%- if ns.is_tool %} {{- '<|tool▁outputs▁end|>'}} {%- endif %} {%- set ns.is_first = false %} {%- set ns.is_tool = false -%} {%- set ns.is_output_first = true %} {%- for tool in message['tool_calls'] %} {%- set arguments = tool['function']['arguments'] %} {%- if arguments is not string %} {%- set arguments = arguments|tojson %} {%- endif %} {%- if not ns.is_first %} {%- if content is none %} {{- '<|tool▁calls▁begin|><|tool▁call▁begin|>' + tool['type'] + '<|tool▁sep|>' + tool['function']['name'] + '\n' + '```json' + '\n' + arguments + '\n' + '```' + '<|tool▁call▁end|>'}} {%- else %} {{- content + '<|tool▁calls▁begin|><|tool▁call▁begin|>' + tool['type'] + '<|tool▁sep|>' + tool['function']['name'] + '\n' + '```json' + '\n' + arguments + '\n' + '```' + '<|tool▁call▁end|>'}} {%- endif %} {%- set ns.is_first = true -%} {%- else %} {{- '\n' +
'<|tool▁call▁begin|>' + tool['type'] + '<|tool▁sep|>' + tool['function']['name'] + '\n' + '```json' + '\n' + arguments + '\n' + '```' + '<|tool▁call▁end|>'}} {%- endif %} {%- endfor %} {{- '<|tool▁calls▁end|><|end▁of▁sentence|>'}} {%- endif %} {%- if message['role'] == 'assistant' and (message['tool_calls'] is not defined or message['tool_calls'] is none) %} {%- set ns.is_last_user = false -%} {%- if ns.is_tool %} {{- '<|tool▁outputs▁end|>' + content + '<|end▁of▁sentence|>'}} {%- set ns.is_tool = false -%} {%- else %} {{- content + '<|end▁of▁sentence|>'}} {%- endif %} {%- endif %} {%- if message['role'] == 'tool' %} {%- set ns.is_last_user = false -%} {%- set ns.is_tool = true -%} {%- if ns.is_output_first %} {{- '<|tool▁outputs▁begin|><|tool▁output▁begin|>' + content + '<|tool▁output▁end|>'}} {%- set ns.is_output_first = false %} {%- else %} {{- '\n<|tool▁output▁begin|>' + content + '<|tool▁output▁end|>'}} {%- endif %} {%- endif %} {%- endfor -%} {%- if ns.is_tool %} {{- '<|tool▁outputs▁end|>'}} {%- endif %} {#- if add_generation_prompt and not ns.is_last_user and not ns.is_tool #} {%- if add_generation_prompt and not ns.is_tool %} {{- '<|Assistant|>'}} {%- endif %}
````
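For reference, the wire format that this template instructs the model to emit for a single tool call can be sketched in plain Python (the function name and arguments below are hypothetical, just to show the shape):

```python
import json

def format_tool_call(name: str, arguments: dict) -> str:
    """Render one tool call in the DeepSeek-R1 style wire format used by
    the chat template above (single call, 'function' type assumed)."""
    fence = "`" * 3  # literal ``` markdown fence around the JSON arguments
    return (
        "<|tool\u2581calls\u2581begin|><|tool\u2581call\u2581begin|>function<|tool\u2581sep|>"
        f"{name}\n{fence}json\n{json.dumps(arguments)}\n{fence}"
        "<|tool\u2581call\u2581end|><|tool\u2581calls\u2581end|>"
    )

print(format_tool_call("get_weather", {"city": "Paris"}))
```

A parser on the Ollama side would need to recognize exactly these delimiters to turn the model's text back into a structured tool call.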

But since it's being pulled directly from Hugging Face, I guess we'll also need to pull down the tokenizer's chat template.
ollama run hf.co/unsloth/DeepSeek-R1-0528-Qwen3-8B-GGUF:Q4_K_M
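One hedged workaround sketch, assuming the Jinja template above is not carried over by the hf.co pull: wrap the model in a local Modelfile and supply your own TEMPLATE (Ollama templates use Go template syntax and must reference `.Tools` for tool support to be detected). The TEMPLATE body below is only a truncated placeholder, not a working conversion of the DeepSeek template:

```
# Hypothetical Modelfile sketch -- the TEMPLATE body must be a real
# Go-template translation of the model's chat template; shown truncated.
FROM hf.co/unsloth/DeepSeek-R1-0528-Qwen3-8B-GGUF:Q4_K_M
TEMPLATE """{{ if .Tools }}...{{ end }}{{ .Prompt }}"""
```

You could then build and run it with `ollama create deepseek-r1-qwen3-tools -f Modelfile` followed by `ollama run deepseek-r1-qwen3-tools`, and check what template actually got attached with `ollama show --modelfile deepseek-r1-qwen3-tools`.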


@rick-github commented on GitHub (Jul 3, 2025):

#8517


@qwerty108109 commented on GitHub (Jul 3, 2025):

This ticket is closed due to being a duplicate.


Reference: github-starred/ollama#53952