[GH-ISSUE #10094] Unable to Unset SYSTEM Prompt When Creating New Models with FROM #68675

Open
opened 2026-05-04 14:48:26 -05:00 by GiteaMirror · 9 comments
Owner

Originally created by @blakkd on GitHub (Apr 2, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10094

Description:

When adding a new model to the local library using `FROM` pointing to a model whose `SYSTEM` prompt has already been set, there is no way to unset the `SYSTEM` prompt for the new model; it can only be changed. Setting it to an empty string (`""`) or multiple quotes (`""""""`) does not work; the `SYSTEM` prompt reverts to the one from the original `FROM` model.

Steps to Reproduce:

  1. Create a model with a specified SYSTEM prompt.
  2. Use FROM in a new modelfile to reference this existing model.
  3. Attempt to set the SYSTEM prompt to an empty string in the new model's configuration.
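
The steps above correspond to a Modelfile pair along these lines (model names and prompt text are hypothetical):

```
# Modelfile.base — step 1: create a model with a SYSTEM prompt set
FROM qwen2.5:0.5b
SYSTEM "You are a helpful assistant."

# Modelfile.derived — steps 2-3: reference the model above, try to blank SYSTEM
FROM base-with-system
SYSTEM ""
```

Creating `base-with-system` from the first file and then a model from the second reproduces the report: `ollama show --system` on the derived model still prints the inherited prompt.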

Expected Behavior:

The SYSTEM prompt should be unset according to the new model's configuration.

Actual Behavior:

The SYSTEM prompt remains unchanged and reverts to the one from the original FROM model.

Workaround: Using a gguf file to create a "virgin" model

  • Download a gguf file.
  • Create a new model with `FROM` pointing to this gguf file (I guess hoping it has a different checksum).
  • This creates new blobs with no `SYSTEM` prompt attached.
  • Use this virgin model as the base for new models.

Worst case scenario:

  • When pulling models from ollama.com, if the `SYSTEM` prompt is already set, there is currently no way to unset it short of downloading a whole gguf file and manually importing it with a Modelfile that has no `SYSTEM` instruction.

OS

Linux

Ollama version

0.6.3


I'd be happy to help with debugging, though I think it's easily reproducible.

GiteaMirror added the bug label 2026-05-04 14:48:26 -05:00

@rick-github commented on GitHub (Apr 2, 2025):

A fully specified Modelfile for the new model accomplishes this, no need for a new GGUF.

```sh
$ ollama show --system qwen2.5:0.5b
You are Qwen, created by Alibaba Cloud. You are a helpful assistant.
$ ollama show --modelfile qwen2.5:0.5b | grep -v SYSTEM > Modelfile
$ ollama create qwen2.5:0.5b-nosystem
$ ollama show --system qwen2.5:0.5b-nosystem
$
```

@blakkd commented on GitHub (Apr 2, 2025):

How did you do this magic!??


Oh OK, I managed to replicate your successful case!
It seems the bug only happens when `FROM` points to blobs that already contain the system prompt.

For more than a year, following the instructions in the generated Modelfile header, I've always been removing the `FROM` line pointing to the existing blobs.

> Modelfile generated by "ollama show"
> To build a new Modelfile based on this, replace FROM with:
> FROM EXAONE-Deep-7B_test_system:latest
>
> ~~FROM /home/user/.ollama/models/blobs/sha256-d4994f8326a5d125763fddcd5be4e4fbc6e936ee06c191c19b56b66e66193ec2~~ <== always removed this part

I NEVER tried to let it point to a blob like you did here! Whenever I forgot to remove it, I noticed the import taking longer, thought something was wrong, and spotted my "mistake" in the modelfile!

So what is going on here!? What should we do?


@rick-github commented on GitHub (Apr 3, 2025):

This is WAI, but I think you're right in that specifying an empty `SYSTEM` should override one that's imported via `FROM model`. I had a brief look at the code but couldn't put my finger on where to make this change. When I get some time I'll have a closer look; in the meantime the more verbose full Modelfile should do the trick.


@blakkd commented on GitHub (Apr 3, 2025):

> This is WAI, but I think you're right in that specifying an empty `SYSTEM` should override one that's imported via `FROM model`. I had a brief look at the code but didn't put my finger on where to make this change. When I get some time I'll have a closer look, in the meantime the more verbose full Modelfile should do the trick.

Yeah, we have to fix this; I'll try to have a look too.
I also think it is unintended behavior, and it's sneaky for end users, who assume the metadata of the created model reflects what they put in their Modelfile.

For now, already thanks for the trick!

PS: What is "WAI"? Is it "WHY" in a fancy WAY? :D


@rick-github commented on GitHub (Apr 3, 2025):

Working As Intended.


@blakkd commented on GitHub (Apr 3, 2025):

Oh OK! Haha, but in that case, I'd rather say it is clearly not, as it's not compatible with the given instructions!
I'm trying to look at `parser/parser.go`, but my skills are nonexistent and my LM is small :D But we'll see if it still leads to something!


@rick-github commented on GitHub (Apr 3, 2025):

It is compatible with the documentation, in that it [doesn't indicate](https://github.com/ollama/ollama/blob/main/docs/modelfile.md#system) that a blank `SYSTEM` message will override an existing message, only that a non-blank message will be used in a template. So it's either poor documentation or a missing feature.


@blakkd commented on GitHub (Apr 3, 2025):

It seems the issue doesn't lie in `parser/parser.go`. We found what we thought was the culprit in `server/create.go`:

```go
func setSystem(layers []Layer, s string) ([]Layer, error) {
	layers = removeLayer(layers, "application/vnd.ollama.image.system")
	if s != "" {
		blob := strings.NewReader(s)
		layer, err := NewLayer(blob, "application/vnd.ollama.image.system")
		if err != nil {
			return nil, err
		}
		layers = append(layers, layer)
	}
	return layers, nil
}
```

I tried simply removing the empty-string condition `if s != "" {}`, but the behavior still persists so far: need to dig deeper.


@blakkd commented on GitHub (Apr 3, 2025):

OK, I had to call in the big brother (Gemini 2.5 Pro Experimental 03-25). There was another conditional check to remove.
I don't know if it's OK to do so, as I have only a little idea of what I'm doing. That said, the behavior is fixed, and we only touched the system prompt part. But if there is a better, more elegant upstream way to achieve this, please suggest an alternative.

Pasting the raw LM output below, worth 1000 of my words:


  1. **Ensure `setSystem` always adds a layer:**

```diff
--- a/server/create.go
+++ b/server/create.go
@@ -556,14 +556,13 @@

 func setSystem(layers []Layer, s string) ([]Layer, error) {
 	layers = removeLayer(layers, "application/vnd.ollama.image.system")
-	if s != "" {
-		blob := strings.NewReader(s)
-		layer, err := NewLayer(blob, "application/vnd.ollama.image.system")
-		if err != nil {
-			return nil, err
-		}
-		layers = append(layers, layer)
-	}
+	// Always create a layer, even if s is empty.
+	// An empty layer explicitly means "no system prompt", preventing inheritance.
+	blob := strings.NewReader(s)
+	layer, err := NewLayer(blob, "application/vnd.ollama.image.system")
+	if err != nil {
+		return nil, err
+	}
+	layers = append(layers, layer)
 	return layers, nil
 }
```
    
  2. **Ensure `createModel` always calls `setSystem`:**

```diff
--- a/server/create.go
+++ b/server/create.go
@@ -351,13 +351,11 @@
 		}
 	}

-	if r.System != "" {
-		layers, err = setSystem(layers, r.System)
-		if err != nil {
-			return err
-		}
-	}
+	// Always call setSystem. If r.System is "" (whether SYSTEM was omitted or
+	// explicitly set to ""), setSystem removes any inherited layer and adds an
+	// empty one, achieving "no system prompt".
+	layers, err = setSystem(layers, r.System)
+	if err != nil {
+		return err
+	}

 	if r.License != nil {
 		switch l := r.License.(type) {
```
    

This combination ensures that:

  1. `setSystem` correctly creates an empty layer when given an empty string.
  2. `createModel` *always* invokes `setSystem`, guaranteeing that any inherited system layer is removed and replaced by the one specified (or explicitly removed via an empty one) in the current Modelfile.
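
To see why the two fixes compose, here is a minimal, runnable toy model of the layer logic. `Layer`, `removeLayer`, and `setSystem` are simplified stand-ins for the real code in `server/create.go` (the real `NewLayer` writes a blob to disk, which is skipped here):

```go
package main

import "fmt"

// Layer is a toy stand-in for the real layer type, keeping only what the
// sketch needs: a media type and the layer's content.
type Layer struct {
	MediaType string
	Data      string
}

// removeLayer drops every layer with the given media type.
func removeLayer(layers []Layer, mediatype string) []Layer {
	var out []Layer
	for _, l := range layers {
		if l.MediaType != mediatype {
			out = append(out, l)
		}
	}
	return out
}

// setSystem as patched above: always emit a system layer, even when s is
// empty, so an empty layer overrides one inherited via FROM.
func setSystem(layers []Layer, s string) []Layer {
	layers = removeLayer(layers, "application/vnd.ollama.image.system")
	return append(layers, Layer{MediaType: "application/vnd.ollama.image.system", Data: s})
}

func main() {
	// Layers inherited from the FROM model, including its SYSTEM prompt.
	inherited := []Layer{
		{MediaType: "application/vnd.ollama.image.model", Data: "(weights)"},
		{MediaType: "application/vnd.ollama.image.system", Data: "You are a pirate."},
	}
	// SYSTEM "" in the new Modelfile: the old prompt is dropped, an empty one added.
	for _, l := range setSystem(inherited, "") {
		fmt.Printf("%s %q\n", l.MediaType, l.Data)
	}
}
```

With the old gating (`if s != ""` plus `if r.System != ""` in `createModel`), an empty string would skip both the removal call path and the new layer, leaving the inherited prompt in place; always calling through makes "" mean "explicitly no prompt".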
Reference: github-starred/ollama#68675