[GH-ISSUE #13968] Qwen2.5:14b output json tool call leak - INCORRECT TOOL CALL #55649

Closed
opened 2026-04-29 09:32:01 -05:00 by GiteaMirror · 5 comments

Originally created by @Arslan-Mehmood1 on GitHub (Jan 29, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/13968

What is the issue?

Hi, I'm using the default qwen2.5:14b Ollama variant (Q4_K_M), but tool calls in JSON format are leaking into the model’s normal response. It only works correctly occasionally.

Here’s an example agent response. It includes random Chinese characters and invalid tool-call syntax:

agent_response =

```
侴
{"name": "start_legal_research", "arguments": {"search_intent": "Types d'organisations avec lesquelles l'État peut conclure des accords pour des contrats de relais pour adultes en France"}}
</tool_call>
```
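
Until the template mismatch is fixed, a caller can try to salvage the leaked call from the plain-text content. A minimal sketch in pure standard-library Python; the function name and return shape are illustrative, not part of any Ollama API:

```python
import json
import re

def salvage_tool_call(content: str):
    """Try to recover a tool call that leaked into plain-text content.

    Strips stray <tool_call>/</tool_call> tag fragments, then scans for a
    JSON object carrying "name" and "arguments" keys (the shape shown in
    the report above). Returns the parsed dict, or None if nothing parses.
    """
    # Drop the tag fragments the model sometimes emits around the JSON.
    cleaned = re.sub(r"</?tool_call>", "", content)
    # Look for a {...} span that parses as a tool call.
    for match in re.finditer(r"\{.*\}", cleaned, re.DOTALL):
        try:
            obj = json.loads(match.group(0))
        except json.JSONDecodeError:
            continue
        if isinstance(obj, dict) and "name" in obj and "arguments" in obj:
            return obj
    return None

leaked = '侴\n{"name": "start_legal_research", "arguments": {"search_intent": "..."}}\n</tool_call>'
call = salvage_tool_call(leaked)
```

This is a workaround for callers only; it does not address the underlying template/parsing issue in the server.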

Relevant log output

...

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.14.0

GiteaMirror added the bug and needs more info labels 2026-04-29 09:32:03 -05:00

@rick-github commented on GitHub (Jan 29, 2026):

Debugging will be easier if you provide code that duplicates the issue and [server logs](https://docs.ollama.com/troubleshooting).


@asherikov commented on GitHub (Feb 8, 2026):

I am presumably hitting the same issue with qwen3-coder:30b and https://github.com/QwenLM/qwen-code: tool calling almost never works for me. Also mentioned in https://github.com/ollama/ollama/issues/13093.

Using ollama/ollama:latest fac4832afb0c container (v0.15.2).
qwen-code:

  1. docker pull ghcr.io/qwenlm/qwen-code
  2. run the container with:

     ```
     docker run --rm -ti \
       -e "OPENAI_API_KEY=ollama" \
       -e "OPENAI_BASE_URL=https://XXX/v1/" \
       -e "OPENAI_MODEL=qwen3-coder:30b" \
       ghcr.io/qwenlm/qwen-code
     ```

  3. type something like "list files in the current directory" at the prompt; it usually returns broken tool calls:

     ```
     > list files in the current directory

     ✦ <function=run_shell_command>
       <parameter=command>
       ls -la
       </parameter>
       </function>
       </tool_call>
     ```

Non-tool queries work.
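
The malformed output above can be spotted mechanically by checking for closing tags that have no matching opener. A minimal sketch; the tag names are taken from the pasted qwen3-coder output, not from any Ollama or Qwen specification:

```python
import re

def has_broken_tool_markup(text: str) -> bool:
    """Heuristic: report True if a closing </tool_call> or </function>
    appears without a corresponding opening tag, as in the broken
    qwen3-coder output quoted above."""
    for tag in ("tool_call", "function"):
        # Opening tags may look like <tag>, <tag=...> or <tag ...>.
        opens = len(re.findall(rf"<{tag}[ =>]", text))
        closes = len(re.findall(rf"</{tag}>", text))
        if closes > opens:
            return True
    return False
```

A client could use such a check to reject and retry a generation instead of passing the broken markup on to the user.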


@rick-github commented on GitHub (Feb 8, 2026):

What context size is the model configured with?


@asherikov commented on GitHub (Feb 8, 2026):

Thank you for the hint. In my case the issue was the context length; bumping it to 64000 as suggested in https://docs.ollama.com/context-length does the job.
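
For reference, the native Ollama API accepts a per-request context window via the `options.num_ctx` field; clients on the OpenAI-compatible endpoint (like qwen-code) instead rely on the server-side default, e.g. the `OLLAMA_CONTEXT_LENGTH` environment variable. A minimal sketch of such a request payload, reusing the model name and limit mentioned in this thread:

```python
import json

# Payload for Ollama's native /api/chat endpoint (POST to
# http://localhost:11434/api/chat). options.num_ctx raises the context
# window for this request to 64000 tokens, as suggested above.
payload = {
    "model": "qwen3-coder:30b",
    "messages": [
        {"role": "user", "content": "list files in the current directory"}
    ],
    "options": {"num_ctx": 64000},
}

body = json.dumps(payload)
```

Too small a context window truncates the chat template and tool definitions, which is one way the model ends up emitting malformed tool-call markup.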


@ninthhousestudios commented on GitHub (Apr 15, 2026):

Cross-link for tracking: #12174 reports the same symptom (bare JSON in `content`, `tool_calls: null`) on `qwen2.5-coder:14b`, reproducible on ollama 0.20.7. The same root cause looks likely: the template expects `<tool_call>...</tool_call>` tags, but the model emits bare JSON without them. Posting here in case it helps maintainers see the pattern across the qwen2.5 family (base + coder).

Reference: github-starred/ollama#55649