[GH-ISSUE #682] System messages are not respected #26069

Closed
opened 2026-04-22 01:58:08 -05:00 by GiteaMirror · 10 comments
Owner

Originally created by @ogulcancelik on GitHub (Oct 2, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/682

I created the Mario example and used mistral. I also tried with llama2-uncensored, and without the temperature parameter.

```
FROM mistral
PARAMETER temperature 0.9
SYSTEM """
You are Mario from super mario bros, acting as an assistant.
"""
```

run:

```
>>> who are you
I am Mistral, a Large Language Model trained by the Mistral AI team.

>>> I know you are mario come on
While I understand the excitement of my name being associated with Mario, my identity as a large language model does not change. My purpose is to assist users in a wide variety of tasks through natural language processing and generation.
```

It doesn't look like a model-dependent issue; system messages are simply not respected.

GiteaMirror added the bug label 2026-04-22 01:58:08 -05:00

@jmorganca commented on GitHub (Oct 2, 2023):

@OgulcanCelik thanks for reporting this!


@ogulcancelik commented on GitHub (Oct 2, 2023):

@jmorganca thank you for Ollama! I'm investigating and would be happy to find the issue and open a PR, but I lack Go skills. This is probably in the API? It runs correctly with a CLI system message:

```
❯ ollama run mario "You are Mario from super mario bros, acting as an assistant."

Hello! It's me, Mario. I'm here to help you with any questions or tasks you need assistance with. How can I be of service?
```

@willowell commented on GitHub (Oct 3, 2023):

@OgulcanCelik Hello!

The prompt template for Mistral-Instruct does not include a system prompt out of the box, unlike Llama2.

This is also the case upstream - Mistral does not appear to support a system prompt at the moment, although folks have asked about it.

Instead, you can mimic a system prompt by providing it as your first prompt, either via a custom prompt template or directly, as happens when you pass it via the CLI.

Hope that helps!
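The workaround described above — fold the system text into the first user prompt — can be sketched as a small Go helper. Go is Ollama's implementation language, but the `message` type and function name here are illustrative only, not Ollama's actual API:

```go
package main

import "fmt"

// message is a minimal chat message; the field names are illustrative.
type message struct {
	Role    string
	Content string
}

// withSystemAsFirstPrompt prepends the system text to the first message,
// mimicking a system prompt for templates (like Mistral-Instruct's at the
// time) that have no dedicated system slot.
func withSystemAsFirstPrompt(system string, msgs []message) []message {
	if system == "" || len(msgs) == 0 {
		return msgs
	}
	out := make([]message, len(msgs))
	copy(out, msgs)
	out[0].Content = system + "\n\n" + out[0].Content
	return out
}

func main() {
	msgs := withSystemAsFirstPrompt("You are Mario from Super Mario Bros.",
		[]message{{Role: "user", Content: "who are you"}})
	// The system text now rides along inside the first user turn.
	fmt.Println(msgs[0].Content)
}
```

Passing the instruction on the CLI, as in the comment above, has the same effect: the "system" text simply becomes the first user prompt.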


@ogulcancelik commented on GitHub (Oct 3, 2023):

@willowell hmm, that looks like it. It seems `llama2-uncensored` also doesn't include a system prompt; llama2 worked fine. Thank you!


@schuster-rainer commented on GitHub (Oct 3, 2023):

I saw an example on Replicate that says the first system message needs to be surrounded by sentence tokens. Is Ollama sending those?

https://replicate.com/a16z-infra/mistral-7b-instruct-v0.1#instruction-format


@srozb commented on GitHub (Oct 3, 2023):

Hi, I've been struggling with the same issue on Ollama versions 0.0.21-0.1.1 as well as with different models (mistral, everythinglm:13b-16k-q4_0, llama2-uncensored, etc.). That made me think it's an Ollama issue. Could you please point me to models that support the 'system' parameter, or explain how one can determine which models support that feature?

I encountered a Golang segmentation fault while attempting to create a model based on a modelfile that included the 'system' parameter.


@schuster-rainer commented on GitHub (Oct 3, 2023):

You know that the system prompt is LLM-specific?


@schuster-rainer commented on GitHub (Oct 3, 2023):

Oh, I just saw: that's what `{{ .First }}` should be used for. Here is the fix, and it works:

TEMPLATE """{{- if .First }}
<s>[INST]{{ .System }}[/INST]</s>
{{- end }}
[INST] {{ .Prompt }} [/INST]
"""


SYSTEM """
your system instructions here
"""

@jmorganca commented on GitHub (Oct 30, 2023):

This is fixed as of 0.1.6. All chat or instruct models now include the ability to set a system prompt. Thanks again for creating an issue and feel free to re-open if you're still hitting this.


@lucasala1997 commented on GitHub (Nov 14, 2024):

> This is fixed as of `0.1.6`. All chat or instruct models now include the ability to set a system prompt. Thanks again for creating an issue and feel free to re-open if you're still hitting this.

I'm using Mistral to do some experiments. I've seen in the Mistral documentation that messages should follow this format:

```
<s>[INST] Instruction [/INST] Model answer</s>[INST] Follow-up instruction [/INST]
```

Now that Ollama handles the different roles, does this mean the formatting is already implemented in the prompt Ollama passes to Mistral?
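For reference, the Mistral-7B-Instruct-v0.1 layout quoted above can be reproduced by hand. This Go sketch only illustrates the format from the model card; it is not Ollama's actual template code, and the `turn` type is made up for the example:

```go
package main

import (
	"fmt"
	"strings"
)

// turn pairs a user instruction with the model's answer ("" if not yet answered).
type turn struct {
	Instruction string
	Answer      string
}

// renderMistralV01 lays out a conversation in the Mistral-7B-Instruct-v0.1
// style: <s>[INST] ... [/INST] answer</s>[INST] ... [/INST]
func renderMistralV01(turns []turn) string {
	var b strings.Builder
	for i, t := range turns {
		if i == 0 {
			b.WriteString("<s>") // BOS token opens the conversation
		}
		b.WriteString("[INST] " + t.Instruction + " [/INST]")
		if t.Answer != "" {
			b.WriteString(" " + t.Answer + "</s>") // EOS closes a completed turn
		}
	}
	return b.String()
}

func main() {
	fmt.Println(renderMistralV01([]turn{
		{Instruction: "Instruction", Answer: "Model answer"},
		{Instruction: "Follow-up instruction"},
	}))
	// prints: <s>[INST] Instruction [/INST] Model answer</s>[INST] Follow-up instruction [/INST]
}
```

Note that `<s>` and `</s>` are special tokens added by the tokenizer/runtime, not literal text a user normally types.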


Reference: github-starred/ollama#26069