[PR #15036] [CLOSED] fix: forward think:true to deepseek-r1 via /v1/chat/completions #40853

Closed
opened 2026-04-23 01:39:06 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/15036
Author: @BillionClaw
Created: 3/24/2026
Status: Closed

Base: main ← Head: fix/chat-completions-think-true


📝 Commits (1)

  • bc0fe32 fix: forward think:true to deepseek-r1 via /v1/chat/completions

📊 Changes

1 file changed (+18 additions, -13 deletions)

View changed files

📝 openai/openai.go (+18 -13)

📄 Description

Description

When a client sends think:true (a boolean) via /v1/chat/completions, the value was silently ignored: ChatCompletionRequest had no field to capture it, and toApiChatRequest only handled reasoning_effort (string effort values). This breaks deepseek-r1, which requires think=true to produce output.

Root Cause

In openai/openai.go, the ChatCompletionRequest struct lacked a Think *bool field, and the toApiChatRequest function only processed reasoning (effort string). The think parameter from the OpenAI API was being dropped.

Fix

  1. Added a Think *bool field (JSON tag json:"think,omitempty") to ChatCompletionRequest
  2. Updated toApiChatRequest to forward the boolean think value to the internal api.ChatRequest when present, before falling back to the reasoning_effort logic

This mirrors how the native Ollama API handles think as a boolean (see api.ChatRequest and api.ThinkValue).
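A minimal sketch of the forwarding order described above. The types here are simplified stand-ins for ChatCompletionRequest, api.ChatRequest, and api.ThinkValue; their exact field shapes are assumptions drawn from the PR text, not the upstream definitions:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Stand-in for api.ThinkValue: wraps either a bool or an effort string.
type thinkValue struct {
	Value any
}

// Stand-in for the internal api.ChatRequest.
type apiChatRequest struct {
	Model string
	Think *thinkValue
}

type chatCompletionRequest struct {
	Model           string `json:"model"`
	Think           *bool  `json:"think,omitempty"` // field added by the fix
	ReasoningEffort string `json:"reasoning_effort,omitempty"`
}

// toAPIChatRequest forwards a boolean think when present, otherwise falls
// back to the reasoning_effort string, mirroring the order in the fix.
func toAPIChatRequest(r chatCompletionRequest) apiChatRequest {
	out := apiChatRequest{Model: r.Model}
	switch {
	case r.Think != nil:
		out.Think = &thinkValue{Value: *r.Think}
	case r.ReasoningEffort != "":
		out.Think = &thinkValue{Value: r.ReasoningEffort}
	}
	return out
}

func main() {
	body := []byte(`{"model":"deepseek-r1","think":true}`)
	var req chatCompletionRequest
	if err := json.Unmarshal(body, &req); err != nil {
		panic(err)
	}
	out := toAPIChatRequest(req)
	fmt.Printf("think forwarded: %v\n", out.Think.Value)
}
```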

Testing

  • think:true is now forwarded to deepseek-r1 via the /v1/chat/completions endpoint
  • Existing reasoning_effort logic is preserved as a fallback

Fixes ollama/ollama#15029


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-23 01:39:06 -05:00

Reference: github-starred/ollama#40853