[GH-ISSUE #11991] Ollama gpt-oss:20b fail on tool call in Cline #54476

Open
opened 2026-04-29 06:04:11 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @Radrik5 on GitHub (Aug 20, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11991

Originally assigned to: @drifkin on GitHub.

What is the issue?

I'm using the most recent version of Ollama, 0.11.5, and gpt-oss:20b (aa4295ac10c3), pulled today.

When I select the Ollama or OpenAI Compatible provider in Cline (VS Code extension) and ask just "hi", the request fails with the error "Unexpected API Response: The language model did not provide any assistant messages. This may indicate an issue with the API or the model's output."

It seems like the model decided to call one of Cline's tools but Ollama failed to parse it; the log shows: level=WARN source=harmonyparser.go:415 msg="harmony parser: no reverse mapping found for function name" harmonyFunctionName=ask_followup_question

Is there a fix or workaround?

Relevant log output

Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.655Z level=TRACE source=harmonyparser.go:308 msg="harmony event header complete" header="{Role:assistant Channel:commentary Recipient:ask_followup_question}"
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.670Z level=TRACE source=bytepairencoding.go:246 msg=decoded string="{\"" from=[10848]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.670Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content="{\"" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.685Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=question from=[14921]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.686Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=question state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.701Z level=TRACE source=bytepairencoding.go:246 msg=decoded string="\":\"" from=[7534]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.701Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content="\":\"" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.716Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=What from=[4827]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.716Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=What state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.731Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" would" from=[1481]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.731Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" would" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.746Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" you" from=[481]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.746Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" you" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.761Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" like" from=[1299]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.761Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" like" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.776Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" me" from=[668]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.776Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" me" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.791Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" to" from=[316]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.791Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" to" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.806Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" help" from=[1652]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.806Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" help" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.821Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" you" from=[481]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.821Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" you" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.836Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" with" from=[483]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.836Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" with" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.851Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" today" from=[4044]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.851Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" today" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.866Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=? from=[30]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.866Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=? state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.881Z level=TRACE source=bytepairencoding.go:246 msg=decoded string="\",\"" from=[4294]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.882Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content="\",\"" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.896Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=options from=[5805]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.897Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=options state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.912Z level=TRACE source=bytepairencoding.go:246 msg=decoded string="\":[\"" from=[95067]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.912Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content="\":[\"" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.927Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=Create from=[5104]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.927Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=Create state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.944Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" a" from=[261]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.944Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" a" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.960Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" new" from=[620]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.960Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" new" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.975Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" file" from=[1974]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.975Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" file" state=2
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.990Z level=TRACE source=bytepairencoding.go:246 msg=decoded string="\",\"" from=[4294]
Aug 20 15:10:07 localhost ollama[204562]: time=2025-08-20T15:10:07.990Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content="\",\"" state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.005Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=Modify from=[45448]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.005Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=Modify state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.020Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" an" from=[448]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.020Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" an" state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.035Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" existing" from=[9595]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.035Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" existing" state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.050Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" file" from=[1974]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.050Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" file" state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.065Z level=TRACE source=bytepairencoding.go:246 msg=decoded string="\",\"" from=[4294]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.065Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content="\",\"" state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.080Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=Run from=[9050]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.080Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=Run state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.095Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" a" from=[261]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.095Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" a" state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.110Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=" script" from=[11713]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.110Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=" script" state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.125Z level=TRACE source=bytepairencoding.go:246 msg=decoded string="\",\"" from=[4294]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.125Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content="\",\"" state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.140Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=Other from=[13863]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.140Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=Other state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.155Z level=TRACE source=bytepairencoding.go:246 msg=decoded string="\"]" from=[2601]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.156Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content="\"]" state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.170Z level=TRACE source=bytepairencoding.go:246 msg=decoded string=} from=[92]
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.171Z level=TRACE source=harmonyparser.go:331 msg="harmony event content" content=} state=2
Aug 20 15:10:08 localhost ollama[204562]: time=2025-08-20T15:10:08.186Z level=WARN source=harmonyparser.go:415 msg="harmony parser: no reverse mapping found for function name" harmonyFunctionName=ask_followup_question
Aug 20 15:10:08 localhost ollama[204562]: [GIN] 2025/08/20 - 15:10:08 | 200 | 14.020023668s |             ::1 | POST     "/v1/chat/completions"
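
For readability, concatenating the decoded token stream above reconstructs the arguments the model emitted for the `ask_followup_question` call:

```json
{
  "question": "What would you like me to help you with today?",
  "options": ["Create a new file", "Modify an existing file", "Run a script", "Other"]
}
```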

Tool description in the prompt

## ask_followup_question
Description: Ask the user a question to gather additional information needed to complete the task. This tool should be used when you encounter ambiguities, need clarification, or require more details to proceed effectively. It allows for interactive problem-solving by enabling direct communication with the user. Use this tool judiciously to maintain a balance between gathering necessary information and avoiding excessive back-and-forth.
Parameters:
- question: (required) The question to ask the user. This should be a clear, specific question that addresses the information you need.
- options: (optional) An array of 2-5 options for the user to choose from. Each option should be a string describing a possible answer. You may not always need to provide options, but it may be helpful in many cases where it can save the user from having to type out a response manually. IMPORTANT: NEVER include an option to toggle to Act mode, as this would be something you need to direct the user to do manually themselves if needed.
Usage:
<ask_followup_question>
<question>Your question here</question>
<options>
Array of options here (optional), e.g. [\"Option 1\", \"Option 2\", \"Option 3\"]
</options>
</ask_followup_question>

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.11.5

GiteaMirror added the gpt-oss, tools, bug labels 2026-04-29 06:04:12 -05:00

@drifkin commented on GitHub (Aug 20, 2025):

Hi @Radrik5, thanks for the report, I'll take a look into it and see if I can repro as well (I suspect so, just haven't tried it yet). The log line you found about the reverse mapping is just a warning. It means we parsed it successfully, but we didn't see a tool definition passed in that matches it. Even in that case, we still return the tool call to the caller so that they can decide what to do with it.

From the snippet you've shown, it looks like the tool definitions aren't being passed in via the API and are instead just described in the prompt directly? My guess is Cline hopes these pseudo "tool calls" end up as part of the assistant's normal response text, which it then parses itself. It seems like Cline isn't using tool calling in a "first class" way. Not sure if that's a configuration issue or the way that Cline always works. gpt-oss is interesting in cases like this: if you tell it to call functions, even if you don't provide them, it is able to call them in the harmony format, which is my main suspicion of what's going on here.
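
As a hypothetical sketch of what "the caller can decide what to do with it" could look like on the client side, here is how an OpenAI-compatible chat completion carrying such a tool call might be unpacked even when no matching tool definition was declared. The function name `extract_tool_calls` and the sample response are illustrative, not part of any real Cline or Ollama API; the sample's shape mirrors the call seen in the trace log above.

```python
import json

def extract_tool_calls(response: dict) -> list[dict]:
    """Pull tool calls out of an OpenAI-compatible chat completion,
    whether or not the caller declared the tool beforehand."""
    message = response["choices"][0]["message"]
    calls = []
    for call in message.get("tool_calls") or []:
        fn = call["function"]
        calls.append({
            "name": fn["name"],
            # arguments arrive as a JSON-encoded string
            "arguments": json.loads(fn["arguments"]),
        })
    return calls

# Illustrative response shaped like the ask_followup_question call in the logs.
sample = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": "",
            "tool_calls": [{
                "function": {
                    "name": "ask_followup_question",
                    "arguments": json.dumps({
                        "question": "What would you like me to help you with today?",
                        "options": ["Create a new file", "Modify an existing file",
                                    "Run a script", "Other"],
                    }),
                }
            }],
        }
    }]
}
```

A client that only looks at `message.content` (as Cline appears to) would see an empty string here, which matches the "did not provide any assistant messages" error.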


@Radrik5 commented on GitHub (Aug 20, 2025):

Hi @drifkin, thanks for the reply! I didn't dig deeper into how Cline uses the API; it would be nice if Ollama logged the whole request at some log level, not just the prompt. Cline has separate settings for the Ollama provider and the OpenAI Compatible provider, but they return the same error for me. The same goes for the Roo Code plugin, which is an advanced fork of Cline.

Previously I managed to make llama.cpp work with Cline and gpt-oss using the grammar file mentioned in this comment: https://github.com/ggml-org/llama.cpp/pull/15181#issuecomment-3196427393 But I like ollama and I would like to use it with coding agents in addition to Open WebUI.


@drifkin commented on GitHub (Aug 20, 2025):

Ah I see, thanks for pointing me to that conversation. Yes, it looks like Cline doesn't use "native" tool calling and instead uses its own tool calling conventions. And gpt-oss is so good at tool calling that it understands Cline is trying to make the equivalent of tool calls, and converts them to native tool calls. There might be a way to prompt out of this, but it'd be really great if Cline optionally looked at "real" tool calls. I'll try to carve out some time to investigate more in depth soon.
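
One conceivable client-side shim (purely a sketch, nothing Cline or Ollama actually provides) would translate a native tool call back into the XML convention Cline expects in the assistant's plain text, following the `Usage:` template from the tool description above:

```python
import json

def tool_call_to_cline_xml(name: str, arguments: str) -> str:
    """Render a native tool call (name + JSON-encoded arguments) in the
    XML-ish format Cline parses out of assistant text. Hypothetical shim,
    not a real Cline feature."""
    args = json.loads(arguments)
    parts = [f"<{name}>"]
    for key, value in args.items():
        # Strings pass through; lists/objects are re-serialized as JSON,
        # matching the "Array of options here" example in the tool spec.
        rendered = value if isinstance(value, str) else json.dumps(value)
        parts.append(f"<{key}>{rendered}</{key}>")
    parts.append(f"</{name}>")
    return "\n".join(parts)
```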

Author
Owner

@mattans commented on GitHub (Sep 29, 2025):

I am also getting this with Ollama and gpt-oss:20b, without Cline, but with Semantic Kernel in an agent setting instead.
At first the function is called with the "Plugin-" prefix.
I don't know if it's Ollama-related, but after the tool call finishes, it re-runs the tool with the "Plugin-" prefix stripped from the tool's name, essentially running the same tool twice.
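Until the underlying cause is found, one client-side workaround for the double execution described above is to normalize the name and skip a call that repeats the previous one with identical arguments. A sketch, with the `Plugin-` prefix and the helper names assumed, not taken from Semantic Kernel or Ollama:

```python
# Hypothetical client-side guard: normalize Semantic Kernel-style
# "Plugin-" prefixes and drop a tool call whose normalized name and
# arguments match one already executed, so the tool only runs once.
def normalize_name(name: str) -> str:
    prefix = "Plugin-"
    return name[len(prefix):] if name.startswith(prefix) else name

def dedupe_calls(calls):
    seen = set()
    result = []
    for name, args in calls:
        key = (normalize_name(name), tuple(sorted(args.items())))
        if key in seen:
            continue  # same tool with same args already executed
        seen.add(key)
        result.append((normalize_name(name), args))
    return result

calls = [("Plugin-search", {"q": "ollama"}), ("search", {"q": "ollama"})]
print(dedupe_calls(calls))  # only one call survives
```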


@gattytto commented on GitHub (Oct 2, 2025):

Hello, I want to add my five cents to this issue: I get this when using Open WebUI's built-in search function with ollama_cloud as the search provider. The problem does not show up if I disable the fetch tool and let Open WebUI's internal search tool do the fetching.

```
2025-10-02T21:57:42.988Z | time=2025-10-02T21:57:42.988Z level=WARN source=harmonyparser.go:482 msg="harmony parser: no reverse mapping found for function name" harmonyFunctionName=search
```

It correctly calls the fetch tool to fetch search results, but I have provided the fetch tool as a separate MCP server.

<img width="296" height="364" alt="Image" src="https://github.com/user-attachments/assets/1b32375a-5579-4163-a8ac-20c736c77f3b" />
<img width="959" height="398" alt="Image" src="https://github.com/user-attachments/assets/a603cae5-ceb9-4cb4-b140-1de3784f603b" />

@Hemanth21k commented on GitHub (Oct 20, 2025):

Hi, I wanted to add that the underlying issue must be due to the parsing of the [OpenAI-Harmony](https://cookbook.openai.com/articles/openai-harmony#roles) format in which the GPT-OSS model has been trained. So far, I've tested the Modelfiles of other thinking models, such as DeepSeek-R1 and Devstral, to see what Cline is familiar with. But the GPT-OSS Modelfile is quite different (and much harder to modify without breaking the model).

I managed to run GPT-OSS:20B inside Cline by disabling its built-in tool calling (as @drifkin mentioned above, GPT-OSS might be recognizing Cline's tool calls), and I also removed its Chain of Thought (CoT) for now.

GPT-OSS_Cline_ModelFile:

```
#This ModelFile is modified to work inside Cline

FROM gpt-oss:20b

TEMPLATE """<|start|>system<|message|>You are ChatGPT, a large language model trained by OpenAI.
Knowledge cutoff: 2024-06
Current date: {{ currentDate }}

# Valid channels: final. Channel must be included for every message.
<|end|>
{{- /* end of system */ -}}

{{- if .System -}}
<|start|>developer<|message|>
# Instructions

{{ .System }}
<|end|>
{{- end -}}
{{- /* Check if the last message is an assistant message we are pre-filling */ -}}
{{- $prefillingContent := false }}
{{- range $i, $msg := .Messages }}
  {{- $last := eq (len (slice $.Messages $i)) 1 -}}
  {{- if and $last (eq $msg.Role "assistant") (gt (len $msg.Content) 0) }}
    {{- $prefillingContent = true }}
  {{- end }}
{{- end -}}
{{- /* Now render messages */ -}}
{{- range $i, $msg := .Messages }}
  {{- if eq $msg.Role "user" -}}
    <|start|>{{ $msg.Role }}<|message|>{{ $msg.Content }}<|end|>
  {{- else if eq $msg.Role "assistant" -}}
    {{- if gt (len $msg.Content) 0 -}}
      <|start|>assistant<|channel|>final<|message|>{{ $msg.Content }}{{- if not $prefillingContent -}}<|end|>{{- end -}}
    {{- end -}}
  {{- end }}
{{- end -}}
{{- if not $prefillingContent -}}
<|start|>assistant
{{- end -}}"""

PARAMETER temperature 0.7

LICENSE """
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/LICENSE-2.0
"""
```

You can create a new model, which will then be available inside Cline, from this Modelfile by running:

```
$ ollama create gpt-oss-cline -f GPT-OSS_Cline_ModelFile
```

PS: Please let me know if you are able to improve this ModelFile to bring back the tool calling and thinking functionalities.
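The `$prefillingContent` check in the template above can be mirrored in plain code to make its intent clearer. A sketch, with the message shape assumed to match Ollama's chat messages:

```python
# Mirrors the template's $prefillingContent logic: true only when the
# last message is an assistant message with non-empty content, in which
# case the template omits the final <|end|> so the model continues the
# pre-filled text instead of starting a fresh turn.
def is_prefilling(messages: list[dict]) -> bool:
    if not messages:
        return False
    last = messages[-1]
    return last.get("role") == "assistant" and len(last.get("content", "")) > 0

print(is_prefilling([{"role": "user", "content": "hi"}]))          # False
print(is_prefilling([{"role": "assistant", "content": "Sure,"}]))  # True
```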


@benbenz commented on GitHub (Dec 3, 2025):

I am encountering the same issue in VS Code (Copilot interface setup with Ollama/OSS and an MCP server).
Somehow OSS was behaving completely fine yesterday: no warning and no error on the client side.
I updated my system today and I am seeing the same issue now.
I rolled back to Ollama 0.12.6 (from 0.13.0) and this did not fix anything.
I rolled back to an older version of VS Code (Copilot Chat Extension 0.32.5, VS Code 1.105.0) and it seems to work fine again.
Copilot Chat Extension 0.33.3 / VS Code 1.106 does not work either.
Could it be related to VS Code, then?

configuration:

```
Version: 1.105.0
Commit: 03c265b1adee71ac88f833e065f7bb956b60550a
Date: 2025-10-08T14:09:35.891Z
Electron: 37.6.0
ElectronBuildId: 12502201
Chromium: 138.0.7204.251
Node.js: 22.19.0
V8: 13.8.258.32-electron.0
OS: Linux x64 6.12.60
GitHub Chat Copilot: 0.32.5
```


@karoldydo commented on GitHub (Dec 7, 2025):

I'll add my two cents:

**Docker**

- `ollama/ollama:0.13.1-rocm`

**Model**

- https://huggingface.co/unsloth/gpt-oss-20b-GGUF

**MCPO Configuration**

```json
{
  "mcpServers": {
    "web-search-prime": {
      "type": "streamable-http",
      "url": "https://api.z.ai/api/mcp/web_search_prime/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_Z_AI_TOKEN_HERE"
      }
    }
  }
}
```

**Chat**

- Ask the model to provide something from the internet

Logs

ollama  | [GIN] 2025/12/07 - 23:14:00 | 200 |     110.306µs |        10.0.0.1 | GET      "/api/ps"
ollama  | time=2025-12-07T23:14:19.663Z level=WARN source=harmonyparser.go:482 msg="harmony parser: no reverse mapping found for function name" harmonyFunctionName=tool_webSearchPrime_post
ollama  | [GIN] 2025/12/07 - 23:14:19 | 500 | 12.597030515s |        10.0.0.1 | POST     "/api/chat"
ollama  | [GIN] 2025/12/07 - 23:14:22 | 200 |  2.340397165s |        10.0.0.1 | POST     "/api/chat"
ollama  | [GIN] 2025/12/07 - 23:14:27 | 200 |   4.96259378s |        10.0.0.1 | POST     "/api/chat"
ollama  | [GIN] 2025/12/07 - 23:14:31 | 200 |    1.115679ms |        10.0.0.1 | GET      "/api/tags"
ollama  | [GIN] 2025/12/07 - 23:14:31 | 200 |      70.307µs |        10.0.0.1 | GET      "/api/ps"
ollama  | time=2025-12-07T23:14:45.417Z level=WARN source=harmonyparser.go:482 msg="harmony parser: no reverse mapping found for function name" harmonyFunctionName=tool_webSearchPrime_post
ollama  | [GIN] 2025/12/07 - 23:14:45 | 500 | 11.273481602s |        10.0.0.1 | POST     "/api/chat"
Reference: github-starred/ollama#54476