[GH-ISSUE #409] Custom model based on codellama just outputs blank lines #25947

Closed
opened 2026-04-22 01:48:49 -05:00 by GiteaMirror · 4 comments

Originally created by @tomduncalf on GitHub (Aug 25, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/409

Hi, super cool project, impressed how easy it was to get started!

I created a custom model based on codellama with a system message explaining its task, but when I run it with my input, the model just infinitely outputs blank lines.

I saw in a [comment on Hacker News](https://news.ycombinator.com/item?id=37252690) that this was a more general problem with codellama until you fixed it, so I wondered if you had any specific thoughts or advice on what might be causing this?

Thanks!


@BruceMacD commented on GitHub (Aug 25, 2023):

Hey @tomduncalf, we tried a few things. The model seems to be pretty sensitive to configuration.

What seemed to help the most with stability was setting the rope frequency base. Here is what that looks like in a Modelfile:

```
PARAMETER rope_frequency_base 1000000
```
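For background, `rope_frequency_base` is the θ base used by rotary position embeddings (RoPE); Code Llama was trained with a base of 1,000,000 rather than Llama 2's 10,000, which is why the default needs overriding. Here is a small sketch of the standard RoPE frequency formula for intuition only (not Ollama's internal code; the head dimension of 128 is just an illustrative choice):

```go
package main

import (
	"fmt"
	"math"
)

// ropeFreqs returns the per-dimension rotation frequencies used by rotary
// position embeddings: freq[i] = base^(-2i/d) for i = 0, 1, ..., d/2-1.
func ropeFreqs(base float64, dim int) []float64 {
	freqs := make([]float64, dim/2)
	for i := range freqs {
		freqs[i] = math.Pow(base, -2*float64(i)/float64(dim))
	}
	return freqs
}

func main() {
	// A larger base stretches the rotation periods of the higher
	// dimensions, which is what extends the usable context window.
	for _, base := range []float64{10000, 1000000} {
		f := ropeFreqs(base, 128)
		// f[0] is always base^0 = 1.0; the last frequency shrinks as base grows.
		fmt.Printf("base=%.0f  first=%.3f  last=%.6g\n", base, f[0], f[len(f)-1])
	}
}
```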

It also seemed pretty sensitive to whitespace at the end of the prompt. Here's an example Modelfile you can work from that matches our default configuration:

```
FROM codellama
TEMPLATE """
[INST] {{ if and .First .System }}<<SYS>>{{ .System }}<</SYS>>

{{ end }}{{ .Prompt }} [/INST]
"""
SYSTEM """
Provide answers in JavaScript
"""
```

@tomduncalf commented on GitHub (Aug 25, 2023):

Hey @BruceMacD, that seems to have helped! Thanks very much :)

Out of interest, where can I find the templates you use internally?

Thanks,
Tom


@BruceMacD commented on GitHub (Aug 30, 2023):

@tomduncalf

They'll soon be listed in each model's description at https://ollama.ai/library

The other option (which works right now) is to download the model and print its info during an interactive session:

```
$ ollama run codellama
>>> /help
commands:
  /help
  /list
  /set
  ├── history
  ├── nohistory
  ├── verbose
  ├── quiet
  ├── mode
  ├────── vim
  ├────── emacs
  ├────── default
  /show
  ├── license
  ├── system
  ├── template
  /exit
  /bye

>>> /show template
[INST] {{ if and .First .System }}<<SYS>>{{ .System }}<</SYS>>
```

Hope that helps.


@tomduncalf commented on GitHub (Aug 30, 2023):

Ooh, nice. It does help! Thanks :)


Reference: github-starred/ollama#25947