[GH-ISSUE #11398] template and system prompt editor #7523

Open
opened 2026-04-12 19:37:24 -05:00 by GiteaMirror · 3 comments

Originally created by @Alias4D on GitHub (Jul 12, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11398

Add support for a user-friendly editor for Ollama Modelfile templates, system prompts, and parameters.

GiteaMirror added the feature request label 2026-04-12 19:37:24 -05:00

@JasonHonKL commented on GitHub (Jul 13, 2025):

Note: the `api/generate` endpoint accepts a `template` argument which overrides the one defined in the model file.

FYI, per api.md:

`template`: the prompt template to use (overrides what is defined in the `Modelfile`)
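The override described above can be exercised directly against the API. A minimal sketch, assuming a default local Ollama install on port 11434; the model name and template text here are illustrative placeholders, not values from this issue:

```shell
# Build a request that overrides the Modelfile template for this one
# request only (the override does not persist to the model).
cat > /tmp/ollama_req.json <<'EOF'
{
  "model": "gpt-oss:20b",
  "prompt": "Why is the sky blue?",
  "template": "{{ .System }}\n\nUser: {{ .Prompt }}\nAssistant:",
  "stream": false
}
EOF

# Send it to the local server; fall back to a message if it isn't up.
curl -s http://localhost:11434/api/generate -d @/tmp/ollama_req.json \
  || echo "request failed (is 'ollama serve' running?)"
```

Parameters such as `system` and `options` can be overridden per request the same way.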

@kha84 commented on GitHub (Aug 9, 2025):

Here is another way to work around this. Extract the original Modelfile used by the model you're interested in (the model should already be pulled into your local Ollama repo):

```
ollama show gpt-oss:20b --modelfile > Modelfile-gpt-oss
```

Then you can modify that Modelfile with any editor of your choice to change whatever you like (usually the system prompt, the default context size, or the temperature) and create a new model (actually just a lightweight reference) from it:

```
ollama create -f Modelfile-gpt-oss gpt-oss-mine:20b
```
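For reference, an edited Modelfile might look roughly like this, a hedged sketch: the `SYSTEM` text and `PARAMETER` values are illustrative, and the file actually exported by `ollama show --modelfile` will also contain the model's full original `TEMPLATE` and license blocks:

```
FROM gpt-oss:20b
SYSTEM "You are a concise assistant."
PARAMETER num_ctx 8192
PARAMETER temperature 0.2
```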

@Alias4D commented on GitHub (Aug 10, 2025):

I'll try it 🙏


Reference: github-starred/ollama#7523