[GH-ISSUE #14202] "error":"Invalid role: thinking" when using cloud gemini-3-pro-preview:latest #9254

Open
opened 2026-04-12 22:07:51 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @eticolat on GitHub (Feb 11, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14202

What is the issue?

Using the Continue.dev extension in VS Code, in Chat or Plan mode:

  1. Submit a first prompt complex enough that the model answers with a thinking section.
  2. Submit a second prompt. The first prompt and answer (including the thinking) are included in the second request as context, and a "thinking" role (in addition to system, user, assistant) is automatically added to the request carrying the model's thoughts from the first answer.

The request then fails: Error HTTP 400 Bad Request from http://{ollama server url}/api/chat {"StatusCode":400,"Status":"400 Bad Request","error":"Invalid role: thinking"}

Running the same process with kimi-k2.5:cloud for the second prompt (exact same context, including the thinking role), the request is answered properly by the model. No error.
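Ollama's /api/chat rejects any role outside system/user/assistant/tool, which is why the injected "thinking" role triggers the 400. As a hypothetical client-side workaround (not something Continue.dev or Ollama ship; `sanitize_messages` is an illustrative name), the stray message could be folded into the following assistant message's `thinking` field, which Ollama's chat API does accept for thinking models:

```python
def sanitize_messages(messages):
    """Merge role == "thinking" entries into the next assistant message.

    Ollama's /api/chat only accepts the roles system/user/assistant/tool,
    so a bare "thinking" message must be carried some other way; here it
    becomes the "thinking" field of the assistant message that follows it.
    """
    out = []
    pending_thinking = None
    for msg in messages:
        if msg.get("role") == "thinking":
            pending_thinking = msg.get("content", "")
            continue
        msg = dict(msg)  # copy so the caller's history is not mutated
        if pending_thinking is not None and msg.get("role") == "assistant":
            msg["thinking"] = pending_thinking
            pending_thinking = None
        out.append(msg)
    return out


# Message history shaped like the failing request from this report:
history = [
    {"role": "system", "content": "system prompt ..."},
    {"role": "thinking", "content": "thoughts from first model answer ..."},
    {"role": "assistant", "content": "Hello I am ready to assist you ..."},
    {"role": "user", "content": "second user prompt"},
]

clean = sanitize_messages(history)
# Every role is now one of system/assistant/user; the first answer's
# thoughts survive as the assistant message's "thinking" field.
```

This only sketches the message transformation; the sanitized list would still be posted to /api/chat as usual.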

Relevant log output

```shell
system
   system prompt ...
thinking
   thoughts from first model answer ...
assistant
   Hello I am ready to assist you ...
user
   second user prompt

Error HTTP 400 Bad Request from http://ollamaServerURL/api/chat {"StatusCode":400,"Status":"400 Bad Request","error":"Invalid role: thinking"}
```

OS

Docker

GPU

No response

CPU

No response

Ollama version

0.15.6

GiteaMirror added the bug label 2026-04-12 22:07:51 -05:00

Reference: github-starred/ollama#9254