[PR #13689] envconfig: add OLLAMA_DEFAULT_THINK for server-wide thinking control #19611

Open
opened 2026-04-16 07:11:42 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/13689
Author: @beyhanmeyrali
Created: 1/12/2026
Status: 🔄 Open

Base: main ← Head: feature/default-think-env


📝 Commits (1)

  • e7a8504 envconfig: add OLLAMA_DEFAULT_THINK for server-wide thinking control

📊 Changes

6 files changed (+95 additions, -37 deletions)


📝 app/ui/app/src/components/ChatForm.tsx (+3 -1)
📝 app/ui/app/src/components/Settings.tsx (+31 -0)
📝 cmd/cmd.go (+4 -34)
📝 docs/faq.mdx (+20 -0)
📝 envconfig/config.go (+4 -0)
📝 server/routes.go (+33 -2)

📄 Description

Summary

Add support for OLLAMA_DEFAULT_THINK environment variable to control default thinking behavior for reasoning models (like Qwen3) at the server level.

  • Add OLLAMA_DEFAULT_THINK to envconfig (supports true/false/high/medium/low)
  • Update server routes to apply default when think is not specified in request
  • Fix CLI so it no longer overrides the server default (return nil instead of forcing true)
  • Add UI toggle in Settings page for default thinking preference
  • Enable think toggle for Qwen3 models in chat interface
  • Add documentation in FAQ
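The envconfig change described above could look roughly like the following. This is a minimal sketch, not the PR's actual code: the helper name `parseDefaultThink` and the `any` return type (nil when unset, bool for true/false, string for an effort level) are assumptions for illustration.

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// parseDefaultThink interprets an OLLAMA_DEFAULT_THINK value. Hypothetical
// helper; the PR's envconfig/config.go may structure this differently.
// Returns nil when the variable is unset, a bool for true/false, or the
// effort level string for high/medium/low.
func parseDefaultThink(raw string) (any, error) {
	switch v := strings.ToLower(strings.TrimSpace(raw)); v {
	case "":
		return nil, nil // unset: no server-wide default
	case "true", "1":
		return true, nil
	case "false", "0":
		return false, nil
	case "high", "medium", "low":
		return v, nil // reasoning-effort level
	default:
		return nil, fmt.Errorf("invalid OLLAMA_DEFAULT_THINK value: %q", raw)
	}
}

func main() {
	v, err := parseDefaultThink(os.Getenv("OLLAMA_DEFAULT_THINK"))
	fmt.Println(v, err)
}
```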

Use Case

Users running Ollama as a backend for automation tools (like n8n) often cannot set the think parameter on each request, because the tool does not expose it. This change allows a server-wide default instead:

OLLAMA_DEFAULT_THINK=false ollama serve

Users can still override per-request via API think parameter or CLI --think flag.
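The override precedence above (explicit per-request value wins, server default otherwise) can be sketched as follows. Names and types are illustrative, not the PR's actual server/routes.go code.

```go
package main

import "fmt"

// applyDefaultThink returns the request's think value when set, otherwise
// the server-wide default parsed from OLLAMA_DEFAULT_THINK. Minimal sketch
// of the precedence rule; field names are assumptions.
func applyDefaultThink(reqThink, serverDefault any) any {
	if reqThink != nil {
		return reqThink // explicit API think / CLI --think always wins
	}
	return serverDefault // fall back to the server-wide default
}

func main() {
	fmt.Println(applyDefaultThink(nil, false))  // server default applies
	fmt.Println(applyDefaultThink(true, false)) // request overrides
}
```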

Test Plan

  • Server respects OLLAMA_DEFAULT_THINK=false (no thinking in API response)
  • CLI respects server default (no thinking with ollama run)
  • Explicit --think=true overrides server default
  • UI toggle works for Qwen3 models
  • Documentation added to FAQ

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-16 07:11:42 -05:00

Reference: github-starred/ollama#19611