[GH-ISSUE #7004] Allow CodeLlama to use tools #66492

Closed
opened 2026-05-04 06:47:59 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @mikegehard on GitHub (Sep 27, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7004

I would like to build a tool that uses test output to help an AI assistant iteratively perform a software refactoring.

In order to do this, I need to allow the model to run the code and I was thinking that an external tool would be perfect for that.

But as of today, it doesn't look like the CodeLlama model has that capability.

It looks like Llama3.2 has that ability so I'm not completely blocked but I thought that using a model tuned for code/software development would give me better results.
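As a concrete sketch of what the "run the tests" tool could look like on the client side: a function that executes a test command and returns its output, plus the JSON schema a tool-calling chat endpoint would be given so the model knows the tool exists. All names here (`run_tests`, `RUN_TESTS_SCHEMA`) are illustrative assumptions, not part of Ollama; the schema layout follows the common OpenAI-style tool convention.

```python
import shlex
import subprocess

def run_tests(command: str = "pytest -q", timeout: int = 120) -> dict:
    """Hypothetical 'run the tests' tool: execute the given test command
    and return its output so the model can use it in the next iteration."""
    proc = subprocess.run(
        shlex.split(command), capture_output=True, text=True, timeout=timeout
    )
    return {
        "exit_code": proc.returncode,
        "stdout": proc.stdout,
        "stderr": proc.stderr,
    }

# Schema handed to a tool-calling chat endpoint; field layout follows the
# widely used OpenAI-style convention (names here are illustrative).
RUN_TESTS_SCHEMA = {
    "type": "function",
    "function": {
        "name": "run_tests",
        "description": "Run the project's test suite and return its output.",
        "parameters": {
            "type": "object",
            "properties": {
                "command": {
                    "type": "string",
                    "description": "Test command to run.",
                },
            },
        },
    },
}
```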

GiteaMirror added the feature request label 2026-05-04 06:47:59 -05:00

@rick-github commented on GitHub (Sep 28, 2024):

Any model can use tools with an appropriate system prompt. However, if the model hasn't been trained for tool use, results will vary. Now that ollama supports native tool use, you could also modify the TEMPLATE to handle tools during query processing. Since codellama is based on llama, it may be as simple as taking a tool-enabled template from a llama model and inserting it into the Modelfile of a copy of codellama (probably not, though):

```console
$ ollama show --modelfile codellama:latest | sed -e '/TEMPLATE.*/,/"/d' -e 's/^FROM.*/FROM codellama:latest/'  > Modelfile
$ echo 'TEMPLATE """'"$(ollama show --template llama3.1)"'"""' >> Modelfile
$ ollama create codellama-tool
```
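If you'd rather do the Modelfile surgery from a script, the same pipeline can be sketched in Python. This is an untested sketch: `build_modelfile` is a hypothetical helper, and it assumes you've already captured the output of `ollama show --modelfile` and `ollama show --template` as strings. It mirrors the sed range `/TEMPLATE.*/,/"/d` (drop from the TEMPLATE line through the next line containing a `"`), including that idiom's quirks.

```python
def build_modelfile(base_modelfile: str, base_model: str, donor_template: str) -> str:
    """Mirror the sed pipeline above: drop the existing TEMPLATE block,
    pin FROM to the base model tag, then append the donor template."""
    out, skipping = [], False
    for line in base_modelfile.splitlines():
        if skipping:
            if '"' in line:          # closing line of the TEMPLATE """...""" block
                skipping = False
            continue
        if line.startswith("TEMPLATE"):
            skipping = True          # start dropping the old TEMPLATE block
            continue
        if line.startswith("FROM"):
            out.append(f"FROM {base_model}")  # pin FROM to the base tag
            continue
        out.append(line)
    out.append(f'TEMPLATE """{donor_template}"""')
    return "\n".join(out) + "\n"
```

Write the result to `Modelfile` and run `ollama create` as above.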

@masterwishx commented on GitHub (Oct 7, 2024):

> Any model can use tools with an appropriate system prompt. [...]
>
> ```console
> $ ollama show --modelfile codellama:latest | sed -e '/TEMPLATE.*/,/"/d' -e 's/^FROM.*/FROM codellama:latest/'  > Modelfile
> $ echo 'TEMPLATE """'"$(ollama show --template llama3.1)"'"""' >> Modelfile
> $ ollama create codellama-tool
> ```

Should these commands work? I got the error `400 codellama:latest does not support tools` when trying to use Ollama with [claude-dev] in VS Code.

Or is there another way? I don't have much experience with AI, sorry.


@rick-github commented on GitHub (Oct 14, 2024):

You need to use the name of the new model, codellama-tool.

However, I had another look at this and using the llama3.1 template in codellama doesn't work. I played around a bit and wrote this template:

```modelfile
TEMPLATE """[INST]
{{- if .Messages }} <<SYS>>
{{- .System }}
{{- if .Tools }}
You have tool calling capabilities.

When you receive a tool call response, use the output to formulate an answer to the original user question.

Given the following tools, please respond with a JSON for a tool call with its proper arguments that best answers the given prompt.

Respond in the format {"name": tool name, "parameters": dictionary of argument name and its value}. Do not use variables.

These are the tools available to you:

{{ range $.Tools }}
{{- . }}
{{ end }}

{{- $response := 0 }}
{{- range $i, $_ := .Messages }}
{{- if or (and (eq .Role "assistant") .ToolCalls) (eq .Role "tool") }}{{ $response = 1 }}{{ end }}
{{- end }}

{{- if $response }}

Here are the previous tool calls and responses:

{{- range $i, $_ := .Messages }}
{{- if and (eq .Role "assistant") .ToolCalls }}
<tool_call>
{{- range .ToolCalls }}
{"name": "{{ .Function.Name }}", "parameters": {{ .Function.Arguments }}}
{{ end -}}
</tool_call>
{{ else if eq .Role "tool" }}
<tool_response>
{"result": {{ .Content }}}
</tool_response>
{{ end }}
{{ end }}
{{ end }}

{{- end -}}
<</SYS>>

{{ $prompt := .Prompt }}
{{- range $i, $_ := .Messages }}
{{- if eq .Role "user" }}{{ $prompt = .Content }}{{ end }}
{{- end }}

{{ $prompt }}
{{- else }} <<SYS>>{{ .System }}<</SYS>>

{{ .Prompt }}
{{ end }}
[/INST]
"""
```

This mostly works, but because codellama is not a trained tool user, it sometimes doesn't use a tool or uses it too many times:

````console
$ ./tool-test.py --model codellama-tools --prompt 'what is the time?'
calling get_datetime({})
calling get_datetime({})

The current time is 10:54 AM.
$ ./tool-test.py --model codellama-tools --prompt 'what is the time?'
calling get_datetime({})
The current time is 10:54.
$ ./tool-test.py --model codellama-tools --prompt 'what is the time?'
The time is 16:35.
$ ./tool-test.py --model codellama-tools --prompt 'what is the time?'
calling get_datetime({})

The current time is 10:55 in 24 hour format.
$ ./tool-test.py --model codellama-tools --prompt 'write a go function that prints the numbers 1 to 100'
```
package main

import "fmt"

func main() {
	for i := 1; i <= 100; i++ {
		fmt.Println(i)
	}
}
```
This function uses a `for` loop to iterate from 1 to 100, and prints each number on a new line using the `fmt.Println()` function.
````

Note that the template is written as a one-shot completion, it will not function as a chat bot.
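Because of that flakiness, it can help to guard the driving loop on the client side: cap the number of tool rounds and skip exact-duplicate calls. A minimal sketch, assuming a `chat` callable that stands in for whatever sends the message list to the model and returns the assistant message; `run_tool_loop` and the message shapes here are assumptions for illustration, not an Ollama API.

```python
import json

def run_tool_loop(chat, tools, user_prompt, max_rounds=3):
    """Drive a tool-calling conversation with two guards for models
    (like this codellama template) that are not trained tool users:
    a cap on rounds, and deduplication of identical tool calls.
    `chat` is any callable(messages) -> assistant message dict."""
    messages = [{"role": "user", "content": user_prompt}]
    seen = set()
    for _ in range(max_rounds):
        reply = chat(messages)
        calls = reply.get("tool_calls") or []
        if not calls:
            return reply["content"]          # model produced a final answer
        messages.append(reply)
        for call in calls:
            key = (call["name"], json.dumps(call["parameters"], sort_keys=True))
            if key in seen:
                continue                     # skip a repeated identical call
            seen.add(key)
            result = tools[call["name"]](**call["parameters"])
            messages.append({"role": "tool", "content": json.dumps(result)})
    return "(gave up after max_rounds tool rounds)"
```

With this guard, the doubled `get_datetime` call in the transcript above would only execute the tool once per round.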


@masterwishx commented on GitHub (Oct 14, 2024):

> You need to use the name of the new model, `codellama-tool`. [...]
>
> Note that the template is written as a one-shot completion, it will not function as a chat bot.

I found another tool for VS Code that works fine with Ollama: Continue.


Reference: github-starred/ollama#66492