[PR #6514] [MERGED] Implicit openai model parameter multiplication disabled #74427

Closed
opened 2026-05-05 06:30:17 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/6514
Author: @yaroslavyaroslav
Created: 8/26/2024
Status: Merged
Merged: 9/7/2024
Merged by: @jmorganca

Base: main ← Head: patch-1


📝 Commits (2)

  • d2461f0 Implicit openai model parameter multiplication disabled
  • a0daca0 Update openai_test.go

📊 Changes

2 files changed (+4 additions, -4 deletions)

View changed files

📝 openai/openai.go (+3 -3)
📝 openai/openai_test.go (+1 -1)

📄 Description

The current openai.go setup breaks perfectly valid OpenAI configs, because it implicitly doubles the numeric parameters it receives.

I see the idea behind making the OpenAI-compatible endpoint match ollama's native endpoint, but I think it was done the wrong way: once again, a completely valid OpenAI config makes the model go wild.

As a result, a user cannot reuse the same config for openai and ollama without changing fields other than the model name.

{
    "url": "http://localhost:11434",
    "token": "sk-your-token",
    "status_hint": [
        "name",
        "prompt_mode",
        "chat_model"
    ],
    "assistants": [
        {
            "name": "qwen2",
            "chat_model": "qwen2:1.5b",
            "assistant_role": "You are a senior python and sublime text 4 code assistant",
            "prompt_mode": "panel",
            "temperature": 1, // makes the model go insane, since it becomes 2 on ollama's side
            "max_tokens": 1048,
            "top_p": 1,
            "frequency_penalty": 0, // doubled as well
            "presence_penalty": 0 // doubled as well
        }
    ]
}

Closes: #6492
Affects: https://github.com/yaroslavyaroslav/OpenAI-sublime-text/issues/57
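The scaling behavior this PR removes can be sketched as follows. This is a minimal illustration, not the actual code from `openai/openai.go`: the function names and the restriction to a single `temperature` parameter are hypothetical simplifications; per the description, `frequency_penalty` and `presence_penalty` were scaled the same way.

```go
package main

import "fmt"

// mapTemperatureBefore reflects the pre-PR behavior: an OpenAI-style
// value was implicitly multiplied by 2 when translated to ollama options,
// so a valid OpenAI temperature of 1 became 2 on ollama's side.
func mapTemperatureBefore(t float64) float64 {
	return t * 2
}

// mapTemperatureAfter reflects the merged behavior: the value is
// forwarded to ollama unchanged.
func mapTemperatureAfter(t float64) float64 {
	return t
}

func main() {
	fmt.Println(mapTemperatureBefore(1.0)) // 2
	fmt.Println(mapTemperatureAfter(1.0))  // 1
}
```

With the fix, the same config can be pointed at either endpoint and the sampling parameters mean the same thing in both.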


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-05-05 06:30:17 -05:00
Reference: github-starred/ollama#74427