[GH-ISSUE #4383] System messages not respected or visible #2737

Closed
opened 2026-04-12 13:02:57 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @emlazzarin on GitHub (May 12, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4383

Originally assigned to: @pdevine on GitHub.

What is the issue?

Hello, I believe I'm running into this issue (same as #682). Note this is using dolphin-llama3:8b.

>>> /set system "Respond only in Spanish."
Set system message.
>>> /clear
Cleared session context
>>> Hello!
Hi there! How can I assist you today?

total duration:       434.008709ms
load duration:        8.663292ms
prompt eval count:    7 token(s)
prompt eval duration: 230.575ms
prompt eval rate:     30.36 tokens/s
eval count:           11 token(s)
eval duration:        188.852ms
eval rate:            58.25 tokens/s
>>> What langauges do you speak?
I can understand and translate text in multiple languages including English, Spanish, French, German, Italian, Portuguese, Dutch, Russian, Polish, Ukrainian, Czech, Slovak, Hungarian, Romanian, Greek, Turkish, Arabic, Persian,
Japanese, Korean, Chinese (Simplified and Traditional), and Hindi.

total duration:       1.369579792s
load duration:        4.936167ms
prompt eval count:    18 token(s)
prompt eval duration: 227.371ms
prompt eval rate:     79.17 tokens/s
eval count:           61 token(s)
eval duration:        1.128215s
eval rate:            54.07 tokens/s
>>> Please repeat your system message.
Apologies for the inconvenience. Here's my full system message: 'You are Dolphin, a helpful AI assistant.'

total duration:       685.984917ms
load duration:        11.783834ms
prompt eval count:    26 token(s)
prompt eval duration: 239.512ms
prompt eval rate:     108.55 tokens/s
eval count:           24 token(s)
eval duration:        429.875ms
eval rate:            55.83 tokens/s
>>> Send a message (/? for help)

This is just a minimal reproduction; more generally, the model appears to have no awareness of the session's system message at all.
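One way to narrow this down is to bypass the CLI and send the system message through Ollama's HTTP API directly; if the model still ignores it there, the problem is in the model/template rather than the `/set system` handling. A minimal sketch, assuming a local server on the default port 11434 and using the documented `/api/chat` endpoint with `system`/`user` message roles:

```python
import json
import urllib.request

# Chat request with an explicit system message, mirroring the /set system repro above.
payload = {
    "model": "dolphin-llama3:8b",
    "messages": [
        {"role": "system", "content": "Respond only in Spanish."},
        {"role": "user", "content": "Hello!"},
    ],
    "stream": False,
}
body = json.dumps(payload).encode("utf-8")

def send(url="http://localhost:11434/api/chat"):
    """POST the chat request; requires a running Ollama server."""
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

# Call send() against a running server; a Spanish-only reply would indicate the
# prompt template does apply the system role and the issue is CLI-specific.
```

This only exercises the request shape; whether the reply actually honors the system message still depends on how the model's template renders the `system` role.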

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.1.37

GiteaMirror added the bug label 2026-04-12 13:02:57 -05:00

@emlazzarin commented on GitHub (May 12, 2024):

I'm able to reproduce this consistently on two different machines with dolphin-llama3:8b but the system message works perfectly with dolphin-llama3:70b.


Reference: github-starred/ollama#2737