[PR #2541] [CLOSED] fix: use requested model template #10926

opened 2026-04-12 23:15:43 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/2541
Author: @BruceMacD
Created: 2/16/2024
Status: Closed

Base: main ← Head: brucemacd/use-req-model-chat


📝 Commits (2)

- d5ad923 fix: use requested model template
- dbb9665 Update routes.go

📊 Changes

1 file changed (+3 additions, -3 deletions)


📝 server/routes.go (+3 -3)

📄 Description

As reported in scenario 1 of #2492

When a request was made to a model that inherits from the currently loaded model, the system prompt and template were not updated in the /chat endpoint. The fix is to use the requested model rather than the loaded one.
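The change can be sketched as follows. This is an illustrative, self-contained Go sketch of the bug pattern, not the actual server/routes.go code; the Model type and systemFor function are hypothetical names:

```go
package main

import "fmt"

// Model is a hypothetical stand-in for a loaded model's metadata.
type Model struct {
	Name     string
	System   string
	Template string
}

// systemFor picks the system prompt to use for a chat request.
// Before the fix, the handler effectively returned loaded.System,
// so a request for phi-french still used phi's prompt when phi was
// already loaded. After the fix, the requested model is used.
func systemFor(loaded, requested *Model) string {
	return requested.System
}

func main() {
	phi := &Model{Name: "phi", System: "You are a helpful assistant."}
	phiFrench := &Model{Name: "phi-french", System: "I want you to speak French only."}

	// Even though phi is the loaded model, the request asked for phi-french.
	fmt.Println(systemFor(phi, phiFrench))
}
```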

Steps to reproduce:

1. Create a model that overrides the system prompt of another model:

   FROM phi
   SYSTEM """I want you to speak French only."""

   ollama create phi-french -f ~/models/phi-french/Modelfile

2. Run the base model:

   ollama run phi

3. Quit the REPL and run the custom model:

   ollama run phi-french

The system message from the base model was not changed, as the loaded model did not change.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-12 23:15:43 -05:00

Reference: github-starred/ollama#10926