[GH-ISSUE #1415] Override SYSTEM parameter by commandline #62791

Open
opened 2026-05-03 10:19:26 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @marco-trovato on GitHub (Dec 7, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1415

According to the documentation, the only way to change the SYSTEM prompt is to create a new model with a Modelfile, using an existing, already-downloaded LLM model as the base:

ollama create choose-a-model-name -f <location of the file e.g. ./Modelfile>

But this will copy and duplicate the model file (often larger than 20 GB).

But using oTerm it is possible to change the SYSTEM prompt; please refer to this screenshot for visual reference:
![image](https://github.com/jmorganca/ollama/assets/18162107/42d7b3b6-3e3c-46f1-859d-29755e97ef2a)

REQUEST:
**Please add a --system command-line flag to force the system prompt. Example usage:**
ollama run codeup:13b-llama2-chat-q4_K_M --verbose --system "Roleplay as Matrix movie operator before answering the question." "Write Python code to loop for 1 to 10"

GiteaMirror added the feature request label 2026-05-03 10:19:26 -05:00

@BruceMacD commented on GitHub (Dec 7, 2023):

Hi @marco-trovato, thanks for opening the issue. The REPL actually supports setting the system parameter as of the most recent release. Here is what that looks like:

ollama run mistral
>>> /set system role play as neo from the matrix
Set system template.

>>> will you take the red pill or the blue pill?

I take the red pill.

Does that meet your needs?


@mxyng commented on GitHub (Dec 7, 2023):

To clarify, while ollama create will create a new model, the model weights are not duplicated on disk. Layers can be shared by multiple models, but only one copy is actually persisted.
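As a sketch of that workflow (assuming ollama is installed and the base model from the original example is already pulled; the model name `matrix-operator` is made up for illustration), a Modelfile that only overrides SYSTEM creates a new model tag without copying the weights:

```shell
# Sketch, assuming ollama is on PATH and the base model is already pulled.
# The Modelfile adds only a SYSTEM layer; ollama create reuses the base
# model's weight layers, so the ~20 GB of weights are not duplicated.
cat > Modelfile <<'EOF'
FROM codeup:13b-llama2-chat-q4_K_M
SYSTEM Roleplay as Matrix movie operator before answering the question.
EOF

# Guarded so the sketch is a no-op on machines without ollama installed:
command -v ollama >/dev/null && ollama create matrix-operator -f Modelfile || true
```

Afterwards `ollama run matrix-operator "..."` uses the overridden system prompt, including non-interactively.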


@marco-trovato commented on GitHub (Dec 7, 2023):

> ollama run mistral
> >>> /set system role play as neo from the matrix
> Does that meet your needs?

This is very useful and I didn't know it, thank you.

Unfortunately, in my specific use case I am trying to use it in **non-interactive mode**, as a command line in a bash script,
i.e.: ollama run codeup:13b-llama2-chat-q4_K_M --verbose "Write Python code to loop for 1 to 10"

The solution you proposed works only interactively and requires the user to actually type the command.
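For scripts, one possible workaround is the HTTP API rather than the CLI: the /api/generate endpoint accepts a per-request "system" field (hedged: check the current API docs for the exact shape), so a bash script can override the system prompt without creating a new model or typing into the REPL:

```shell
# Sketch for non-interactive use: pass the system prompt per request
# through the HTTP API instead of a (not yet existing) --system flag.
payload='{
  "model": "codeup:13b-llama2-chat-q4_K_M",
  "system": "Roleplay as Matrix movie operator before answering the question.",
  "prompt": "Write Python code to loop for 1 to 10",
  "stream": false
}'

# Requires a running ollama server on the default port; guarded so the
# sketch is a no-op when no server is reachable.
curl -sf --max-time 2 -d "$payload" http://localhost:11434/api/generate || true
```

The response is a JSON object whose "response" field holds the model output, which the script can then parse.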


@iplayfast commented on GitHub (Dec 15, 2023):

I agree that it would be useful to have all of ollama's internal / commands available as external parameters when starting the program.


@cirosantilli commented on GitHub (Mar 18, 2025):

Someone semi-asking to set parameters from CLI: https://github.com/ollama/ollama/issues/2505#issuecomment-2435395442

Reference: github-starred/ollama#62791