[PR #14464] openai: support max_completion_tokens parameter #45937

Open
opened 2026-04-25 01:32:10 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/14464
Author: @Anandesh-Sharma
Created: 2/26/2026
Status: 🔄 Open

Base: main ← Head: fix-openai-max-completion-tokens


📝 Commits (1)

  • 89c13ab openai: support max_completion_tokens parameter

📊 Changes

1 file changed (+21 additions, -13 deletions)


📝 openai/openai.go (+21 -13)

📄 Description

Summary

  • Adds max_completion_tokens field to both ChatCompletionRequest and CompletionRequest structs
  • When both max_completion_tokens and max_tokens are provided, max_completion_tokens takes precedence, matching OpenAI's API behavior
  • Maps the value to Ollama's num_predict option

This aligns Ollama's OpenAI compatibility layer with the newer max_completion_tokens parameter that OpenAI introduced as the preferred replacement for max_tokens.
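The precedence rule described above can be sketched in Go. This is a minimal illustration, not the actual patch: the struct is a hypothetical subset of the request types in openai/openai.go, and the helper name numPredict is invented here for clarity.

```go
package main

import "fmt"

// ChatCompletionRequest is a hypothetical subset of the real request
// struct; the JSON tags follow the OpenAI API field names.
type ChatCompletionRequest struct {
	MaxTokens           *int `json:"max_tokens,omitempty"`
	MaxCompletionTokens *int `json:"max_completion_tokens,omitempty"`
}

// numPredict resolves the effective token limit that would be mapped to
// Ollama's num_predict option: max_completion_tokens takes precedence
// over the older max_tokens when both are set, matching OpenAI's API.
func numPredict(r ChatCompletionRequest) (int, bool) {
	if r.MaxCompletionTokens != nil {
		return *r.MaxCompletionTokens, true
	}
	if r.MaxTokens != nil {
		return *r.MaxTokens, true
	}
	return 0, false // neither field set; leave num_predict at its default
}

func main() {
	oldLimit, newLimit := 100, 50
	req := ChatCompletionRequest{MaxTokens: &oldLimit, MaxCompletionTokens: &newLimit}
	if n, ok := numPredict(req); ok {
		fmt.Println(n) // prints 50: max_completion_tokens wins
	}
}
```

Pointer fields are used so that an absent JSON field (nil) can be distinguished from an explicit zero, which is what makes the precedence check well defined.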

Fixes #7125


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-25 01:32:10 -05:00

Reference: github-starred/ollama#45937