[GH-ISSUE #14614] /set think {high|medium|low|true|false} in the TUI crashes qwen3.5 models #9470

Open
opened 2026-04-12 22:23:50 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @sammyf on GitHub (Mar 4, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14614

What is the issue?

Setting `think` to any of the accepted values first returns an error and eventually crashes the application with a 400 Bad Request.

>>> /set think false
Set 'think' mode to 'false'.
>>> hey there
error: 400 Bad Request: invalid think value: "false" (must be "high", "medium", "low", true, or false)
>>> /set think high
Set 'think' mode to 'high'.
>>> hey there
Error: 400 Bad Request: think value "high" is not supported for this model
10:47:16 sammy@neo-bandito ~$

This seems to happen with all official versions of qwen3.5.

Relevant log output

There are no relevant logs; the error seems to happen in the TUI.

 
Mar 04 10:47:15 neo-bandito ollama[1594]: [GIN] 2026/03/04 - 10:47:15 | 400 |  146.015127ms |       127.0.0.1 | POST     "/api/chat"
Mar 04 10:47:15 neo-bandito ollama[1594]: [GIN] 2026/03/04 - 10:47:15 | 400 |  146.015127ms |       127.0.0.1 | POST     "/api/chat"

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

ollama version is 0.17.5

GiteaMirror added the bug label 2026-04-12 22:23:50 -05:00

@fabianfiorotto commented on GitHub (Mar 4, 2026):

The commands for enabling and disabling thinking are

Enable thinking
/set think

Disable thinking
/set nothink


@lennarkivistik commented on GitHub (Mar 5, 2026):

The thinking parameter from the API has worked great for me, but I have only seen
true and false as viable options, or at least that is how I interpret the Hugging Face docs for [Qwen3.5:9b](https://huggingface.co/Qwen/Qwen3.5-9B#instruct-or-non-thinking-mode).

So "high", "medium", "low" is most probably only useful for gpt-oss models.


@liaoweiguo commented on GitHub (Mar 8, 2026):

How do I set think to "low" with curl?


@lennarkivistik commented on GitHub (Mar 8, 2026):

For Ollama, set think to "low" for gpt-oss:20b. Ollama's docs say GPT-OSS expects think to be one of "low", "medium", or "high" rather than true/false.

curl http://localhost:11434/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-oss:20b",
    "messages": [
      {
        "role": "user",
        "content": "Write a short product description for a compact mechanical keyboard."
      }
    ],
    "think": "low",
    "stream": false
  }'

Whilst qwen3.5 expects true and false
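
Following the comment above, the equivalent request for qwen3.5 would send `think` as a JSON boolean rather than a string (the model name and accepted values here are assumptions taken from this thread, not verified against Ollama's docs):

```shell
# Build the chat payload with a boolean "think" (not the string "false",
# which the server rejects per the error message in this issue).
cat > /tmp/qwen_think_payload.json <<'EOF'
{
  "model": "qwen3.5",
  "messages": [
    { "role": "user", "content": "hey there" }
  ],
  "think": false,
  "stream": false
}
EOF

# Send it to a local Ollama server (uncomment to actually run):
# curl http://localhost:11434/api/chat \
#   -H "Content-Type: application/json" \
#   -d @/tmp/qwen_think_payload.json

# Sanity-check that "think" is a JSON boolean, not a string:
python3 -c 'import json; d = json.load(open("/tmp/qwen_think_payload.json")); print(type(d["think"]).__name__)'
```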

Reference: github-starred/ollama#9470