[GH-ISSUE #10488] Can't create model with Modelfile larger than 100 lines. #53410

Closed
opened 2026-04-29 03:02:01 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @gkcccvvv on GitHub (Apr 29, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10488

What is the issue?

I'm working with gemma3:4b, developing an AI model responsible for technical service.

When I write more than 100 lines in the Modelfile, the "create" command appears to work in cmd, but ollama chat starts responding with its defaults. If I delete a few lines, ollama starts responding according to the Modelfile again. How can I expand this limit? Performance is a secondary concern in my application.

Relevant log output


OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.6.6

GiteaMirror added the bug label 2026-04-29 03:02:01 -05:00
Author
Owner

@ehan1990 commented on GitHub (Apr 29, 2025):

Hey @gkcccvvv, your log output section is empty, by the way. I think you might have forgotten to add the logs. Also, could you provide the Modelfile that doesn't work, if possible?

Author
Owner

@rick-github commented on GitHub (Apr 29, 2025):

Which lines do you delete to make it work: SYSTEM, MESSAGE, TEMPLATE, etc?

Author
Owner

@rick-github commented on GitHub (Apr 29, 2025):

If you haven't adjusted the size of the [context window](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-specify-the-context-window-size), the system message exceeds the buffer and is discarded. Add `PARAMETER num_ctx 8192` to the Modelfile or set `OLLAMA_CONTEXT_LENGTH=8192` in the server environment. Change 8192 to whatever you think you need to accommodate the system message and any user/assistant messages.
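For reference, a minimal sketch of the suggested fix as a Modelfile (the model name, context size, and system prompt here are illustrative, not the reporter's actual file):

```
# Illustrative Modelfile: base model from the issue report
FROM gemma3:4b

# Raise the context window so a long SYSTEM block isn't silently truncated;
# pick a value large enough for the system message plus conversation turns.
PARAMETER num_ctx 8192

SYSTEM """
Your long (100+ line) system prompt goes here.
"""
```

Then rebuild with `ollama create <name> -f Modelfile`. Alternatively, setting `OLLAMA_CONTEXT_LENGTH=8192` in the server's environment raises the default for all models served by that instance.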


Reference: github-starred/ollama#53410