[GH-ISSUE #1327] Modelfile prompt should support chat / multiturn. #689

Closed
opened 2026-04-12 10:22:07 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @ehartford on GitHub (Nov 30, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1327

Originally assigned to: @BruceMacD on GitHub.

https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#template

So basically, all that's coming in is `.Prompt`, which is just a string.

But that can't handle chat and multi-turn.

What comes in should instead look like a messages array; the template should then format that into a prompt.

```
[
  { "role": "system", "content": "You are a helpful AI assistant" },
  { "role": "user", "content": "Hello AI, How are you today?" },
  { "role": "assistant", "content": "I have no notion of time. State your question?" },
  { "role": "user", "content": "Oh ok then, tell me the 38th state" }
]
```

Then the template in the Modelfile would look something like:

```
{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}<|im_start|>assistant
```
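As a worked example, the suggested template would render the messages array above into a ChatML-style prompt string. Here is a minimal Python sketch of that rendering (a plain loop standing in for the template engine; the function name is illustrative, not part of ollama):

```python
# Render a ChatML-style prompt from a messages array, mirroring what the
# suggested Jinja-style template would produce. Illustrative only.
def render_chatml(messages):
    prompt = ""
    for message in messages:
        prompt += (
            "<|im_start|>" + message["role"] + "\n"
            + message["content"] + "<|im_end|>" + "\n"
        )
    # Leave an open assistant turn for the model to complete.
    prompt += "<|im_start|>assistant"
    return prompt

messages = [
    {"role": "system", "content": "You are a helpful AI assistant"},
    {"role": "user", "content": "Hello AI, How are you today?"},
]
print(render_chatml(messages))
```

Each turn becomes a delimited block, and the trailing `<|im_start|>assistant` cues the model to generate the next assistant message.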

Basically, the idea that a prompt consists of a single system message and a single user message isn't how most models actually work.

GiteaMirror added the feature request label 2026-04-12 10:22:07 -05:00

@BruceMacD commented on GitHub (Dec 4, 2023):

Hey Eric, I just merged a /chat API that will be in the next release which should cover this.

Here's the doc:
https://github.com/jmorganca/ollama/blob/main/docs/api.md#send-chat-messages

It uses messages in the same format that you've suggested here. The template in the Modelfile is still the same right now, but the behavior matches what you suggested here too (templating for each set of messages).
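For reference, a request to the linked `/api/chat` endpoint carries messages in that same shape. A minimal sketch of building such a request with the standard library (assumes a local ollama server on the default port 11434; the model name is just an example):

```python
# Sketch of a request body for the /api/chat endpoint, using the same
# messages format as the issue suggests. Requires a running ollama server
# to actually send; building the payload works standalone.
import json
import urllib.request

payload = {
    "model": "llama2",  # example model name
    "messages": [
        {"role": "system", "content": "You are a helpful AI assistant"},
        {"role": "user", "content": "tell me the 38th state"},
    ],
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# response = urllib.request.urlopen(req)  # uncomment with a server running
```

The server applies the Modelfile template to the whole messages array, which is the per-turn templating behavior described in the comment above.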

Thanks for the feedback. Let us know if you notice anything else that doesn't seem right.

Reference: github-starred/ollama#689