[GH-ISSUE #7682] OpenCoder's template doesn't make sense for an instruct model #30663

Closed
opened 2026-04-22 10:33:06 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @ProjectMoon on GitHub (Nov 15, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7682

What is the issue?

I am testing out OpenCoder 8b. The information on the page on ollama.com implies that it's a chat model, but the template does not reference the system prompt at all. Setting a system prompt in OpenWebUI does funny things with the chat (the model has a conversation with itself, because it's not generating the right stop tokens). The template as it stands seems to be for a code-completion model, since it includes the fill-in-the-middle suffix markers. Is OpenCoder on the website a chat model or a base model?

OS

Linux

GPU

AMD

CPU

AMD

Ollama version

0.4.1

GiteaMirror added the bug label 2026-04-22 10:33:06 -05:00

@rick-github commented on GitHub (Nov 15, 2024):

The embedded `chat_template` uses system and doesn't mention FIM, so it looks like the ollama template is wrong.

```
{% for message in messages %}{% if loop.first and messages[0]['role'] != 'system' %}{{ '<|im_start|>system
You are OpenCoder, created by OpenCoder Team.<|im_end|>
' }}{% endif %}{{'<|im_start|>' + message['role'] + '
' + message['content'] + '<|im_end|>' + '
'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant
' }}{% endif %}
```
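For anyone who wants to check what this template actually produces, here is a small sketch that renders the embedded `chat_template` with the `jinja2` package (the message list and flags are illustrative; real inference stacks pass them the same way HuggingFace `apply_chat_template` does):

```python
# Sketch: render OpenCoder's embedded chat_template to inspect the prompt
# it builds. Requires the third-party jinja2 package.
from jinja2 import Template

# The chat_template from the model, written as one Python string.
# \n escapes produce the real newlines embedded in the Jinja literals.
chat_template = (
    "{% for message in messages %}"
    "{% if loop.first and messages[0]['role'] != 'system' %}"
    "{{ '<|im_start|>system\nYou are OpenCoder, created by OpenCoder Team.<|im_end|>\n' }}"
    "{% endif %}"
    "{{ '<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n' }}"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}"
)

# No system message supplied, so the template injects its default one.
messages = [{"role": "user", "content": "hi"}]
prompt = Template(chat_template).render(
    messages=messages, add_generation_prompt=True
)
print(prompt)
```

Rendering shows the default system block, the user turn wrapped in `<|im_start|>`/`<|im_end|>`, and a trailing `<|im_start|>assistant` generation prompt — i.e. a chat format, with no FIM markers anywhere.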
<!-- gh-comment-id:2478267789 -->

@jmorganca commented on GitHub (Nov 18, 2024):

It should be a chat model – it doesn't always listen to the system prompt but it does work sometimes:

```
% ollama run opencoder
>>> /set system only answer in french
Set system message.
>>> hi
Bonjour ! Comment puis-je vous aider aujourd'hui ?

(The assistant greets the user and waits for instructions.)
```

I've since removed the FIM template – the tokens are present and it appears to work, but the model was not trained for this 😊

<!-- gh-comment-id:2482152537 -->

@ProjectMoon commented on GitHub (Nov 18, 2024):

So, this seems to work much better now. That is, until a system prompt is set. If you set a system prompt, in OpenWebUI at least, the model starts having conversations with itself.
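A workaround others have used for self-conversation symptoms like this is a custom Modelfile that renders the system prompt and declares the `<|im_end|>` stop token explicitly. This is only a sketch under the assumption that the embedded `chat_template` above is correct — it is not the template Ollama ships, and the model name is whatever your local tag is:

```
# Hypothetical Modelfile: add system-prompt support and a stop token.
FROM opencoder
TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
{{ .Response }}"""
PARAMETER stop <|im_end|>
```

Build and test it with `ollama create opencoder-chat -f Modelfile` followed by `ollama run opencoder-chat`; if the stop token is the cause, the model should no longer keep generating past its own turn.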

<!-- gh-comment-id:2482863544 -->
Reference: github-starred/ollama#30663