[PR #11175] avoid context overflow #13465

Closed
opened 2026-04-13 00:28:03 -05:00 by GiteaMirror · 0 comments
Owner

Original Pull Request: https://github.com/ollama/ollama/pull/11175

State: closed
Merged: Yes


For smaller context models, make sure we do not exceed the training size.

While testing larger context sizes, I noticed our current 4k default context causes problems on models like orca-mini, which have a 2k maximum context.

```
llama_context: n_ctx_per_seq (4096) > n_ctx_train (2048) -- possible training context overflow
```
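The fix amounts to clamping the requested (or default) context length to the model's training context. A minimal sketch in Go, with a hypothetical `clampContext` helper (the names and structure are illustrative, not the actual ollama implementation):

```go
package main

import "fmt"

// clampContext caps the requested context length at the model's
// training context so n_ctx never exceeds n_ctx_train.
// A trainCtx of 0 means the training context is unknown, so the
// requested value is passed through unchanged.
func clampContext(requested, trainCtx int) int {
	if trainCtx > 0 && requested > trainCtx {
		return trainCtx
	}
	return requested
}

func main() {
	// orca-mini trains with a 2k context, so the 4k default is clamped.
	fmt.Println(clampContext(4096, 2048))
	// A model trained with 8k context keeps the 4k default as-is.
	fmt.Println(clampContext(4096, 8192))
}
```

With this guard in place, the `n_ctx_per_seq (4096) > n_ctx_train (2048)` warning above would no longer be triggered for small-context models.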
GiteaMirror added the pull-request label 2026-04-13 00:28:03 -05:00
Reference: github-starred/ollama#13465