[GH-ISSUE #12229] Native Tool Calling output broken after update to v0.11.10 #70197

Closed
opened 2026-05-04 20:38:15 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @camfirem on GitHub (Sep 9, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12229

What is the issue?

After updating from Ollama v0.11.4 to v0.11.10, native tool calling no longer works correctly with certain models.

With Qwen3 models, enabling native tool calling results in the output:
3333333333333333333333333333333

With Llama 3.2 models, enabling native tool calling results in the output:
GGGGGGGGGGGGGGGGGGGGGGGGGGGGGGG

When the models are unloaded and rerun without native tool calling, they work as expected.
This issue only occurs when native tool calling is active and has been reproducible since the update.

Environment:
Ollama v0.11.10 (worked fine in v0.11.4; downgrading not yet tried)
OpenWebUI frontend v0.6.25
NVIDIA Jetson Orin Nano
Ubuntu 22.04 Jammy
Kernel: aarch64 Linux 5.15.148-tegra

Steps to reproduce:

  • Update Ollama from v0.11.4 to v0.11.10
  • Load a Qwen3 or Llama 3.2 model
  • Enable native tool calling
  • Run a request using tools
  • Observe incorrect repeated-character output
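The steps above can be sketched as a single request against Ollama's /api/chat endpoint with the `tools` field set (this is a minimal repro sketch; the model tag and the `get_weather` tool schema are illustrative assumptions, not from the original report):

```python
import json

# Hypothetical minimal tool-calling request for Ollama's /api/chat endpoint.
# The model tag and tool schema below are illustrative.
payload = {
    "model": "qwen3",
    "messages": [
        {"role": "user", "content": "What is the weather in Berlin?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "stream": False,
}

# Write the payload out; POST it to a local Ollama instance, e.g.:
#   curl http://localhost:11434/api/chat -d @payload.json
body = json.dumps(payload)
```

On v0.11.10 on the Jetson, a request like this reportedly yields the repeated-character output instead of a tool call; omitting the `tools` field avoids the problem.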

Expected behavior:
The model should correctly process tool calls instead of outputting repeated characters.

Relevant log output

(none provided)
OS

Linux

GPU

Nvidia

CPU

Other

Ollama version

0.11.10

GiteaMirror added the bug label 2026-05-04 20:38:15 -05:00

@rick-github commented on GitHub (Sep 9, 2025):

Seems to be an issue with the Jetson: #12209


@camfirem commented on GitHub (Sep 9, 2025):

Damn, that's unfortunate. I just rolled Ollama back to v0.11.4. Now it's working fine again. I guess I'll have to stay there for now.


@pdevine commented on GitHub (Sep 9, 2025):

Going to close as a dupe.

Reference: github-starred/ollama#70197