[GH-ISSUE #10879] codestral doesn't allow tool calling #69208

Closed
opened 2026-05-04 17:28:37 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @nickkaltner on GitHub (May 27, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10879

What is the issue?

When I use codestral, the chat endpoint works fine with the model and messages parameters. When I add the tools parameter, it returns a 400.
I'm using the Elixir ollama library, and it gives
%Ollama.HTTPError{status: 400, message: "Bad Request"}

Here is the example script:

model = "codestral:22b-v0.1-q2_K"

stock_price_tool = %{
  type: "function",
  function: %{
    name: "get_stock_price",
    description: "Fetches the live stock price for the given ticker.",
    parameters: %{
      type: "object",
      properties: %{
        ticker: %{
          type: "string",
          description: "The ticker symbol of a specific stock."
        }
      },
      required: ["ticker"]
    }
  }
}

messages = [
  %{role: "system", content: "You are a helpful assistant."},
  %{role: "user", content: "what is apple's latest stock price?"}
]

{:ok, resp} = Ollama.chat(client,
  model: model,
  messages: messages,
  tools: [stock_price_tool],
  keep_alive: 10000
)

IO.puts inspect(resp)

IO.puts resp["message"]["content"]

resp

If I comment out tools: [stock_price_tool], it works (without the tool use).
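For reference, the request body the Elixir client ends up POSTing to /api/chat looks roughly like this. This is a minimal sketch in Python rather than Elixir, with the tool schema and messages copied from the script above; adding the "tools" key is what triggers the 400 for models whose template has no tool support.

```python
import json

# Same tool schema as the Elixir script above, as a Python dict.
stock_price_tool = {
    "type": "function",
    "function": {
        "name": "get_stock_price",
        "description": "Fetches the live stock price for the given ticker.",
        "parameters": {
            "type": "object",
            "properties": {
                "ticker": {
                    "type": "string",
                    "description": "The ticker symbol of a specific stock.",
                }
            },
            "required": ["ticker"],
        },
    },
}

# Approximate JSON body sent to POST /api/chat.
body = {
    "model": "codestral:22b-v0.1-q2_K",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "what is apple's latest stock price?"},
    ],
    "tools": [stock_price_tool],
}

print(json.dumps(body, indent=2))
```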

Relevant log output

[GIN] 2025/05/27 - 21:19:14 | 400 |   20.177917ms |       127.0.0.1 | POST     "/api/chat"
[GIN] 2025/05/27 - 21:19:29 | 400 |   18.747583ms |       127.0.0.1 | POST     "/api/chat"
[GIN] 2025/05/27 - 21:26:01 | 400 |   18.512958ms |       127.0.0.1 | POST     "/api/chat"

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.7.1

GiteaMirror added the bug label 2026-05-04 17:28:37 -05:00
@arturo-air commented on GitHub (May 27, 2025):

If you want to make `tools` calls, you should use models [marked as tools](https://ollama.com/search?c=tools). You probably want to try [devstral](https://ollama.com/library/devstral); it is also from Mistral and it allows `tools`.
@nickkaltner commented on GitHub (May 27, 2025):

I guess that's my confusion: why isn't it marked as a tools model? From the Codestral page:

"Code generation is one of the most popular LLM use-cases, so we are really excited about the Codestral release. From our initial testing, it's a great option for code generation workflows because it's fast, has favorable context window, and the instruct version supports tool use. We tested with LangGraph for self-corrective code generation using the instruct Codestral tool use for output, and it worked really well out-of-the-box (see our [video detailing this](https://youtu.be/zXFxmI9f06M))."

-- Harrison Chase, CEO and co-founder of LangChain

https://mistral.ai/news/codestral
@nickkaltner commented on GitHub (May 27, 2025):

(thanks for the devstral tip)
@rick-github commented on GitHub (Jun 10, 2025):

The [chat template](https://huggingface.co/mistralai/Codestral-22B-v0.1/blob/main/tokenizer_config.json#L6176) for the original model doesn't include tool calling, which is likely why the ollama model doesn't have it. However, there are [special tokens](https://huggingface.co/mistralai/Codestral-22B-v0.1/blob/main/tokenizer_config.json#L47) that are used for tool parsing, so somebody industrious enough could modify the template to add tool calling.
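As an untested sketch of what that modification might look like: Ollama models are customized via a Modelfile, and Codestral's tokenizer defines `[AVAILABLE_TOOLS]`, `[/AVAILABLE_TOOLS]`, and `[TOOL_CALLS]` special tokens. A minimal single-turn template (assuming those token names carry over; a real template would also need to render assistant tool calls and tool responses in multi-turn history) could look like:

```
# Hypothetical Modelfile, untested: builds a codestral variant whose
# template injects the tool definitions using Mistral's special tokens.
FROM codestral:22b-v0.1-q2_K

TEMPLATE """{{- if .Tools }}[AVAILABLE_TOOLS] {{ .Tools }}[/AVAILABLE_TOOLS]{{- end }}[INST] {{ if .System }}{{ .System }}

{{ end }}{{ .Prompt }}[/INST]"""
```

It could be built with `ollama create codestral-tools -f Modelfile`; whether the quantized weights then reliably emit well-formed `[TOOL_CALLS]` output that Ollama can parse is a separate question.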

Reference: github-starred/ollama#69208