[GH-ISSUE #10830] Remove num_ctx parameter from mistral-small3.1:latest modelfile #7111

Open
opened 2026-04-12 19:05:55 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @dguembel-itomig on GitHub (May 23, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10830

The model file for Mistral Small 3.1 sets the number of context tokens to 4096 explicitly. This is atypical, as most other model files do not set this parameter in such a manner. Additionally, it cannot be overridden using an environment variable. Cf. #10829 for an example of the side-effects.

I suggest removing that setting from the model file, in order to make behavior consistent with mistral-small:latest (and many other models' modelfiles :)

```
# docker exec -it watchtower-ollama-1 ollama run mistral-small3.1:latest
>>> /show parameters
Model defined parameters:
num_ctx                        4096
```
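Until the modelfile is changed upstream, one possible workaround (a sketch, not something proposed in the issue thread) is to derive a local model whose Modelfile overrides the baked-in parameter; `FROM` and `PARAMETER` are standard Modelfile directives, and the model name and context value below are illustrative:

```
FROM mistral-small3.1:latest
PARAMETER num_ctx 32768
```

Build it with `ollama create mistral-small3.1-32k -f Modelfile` and point clients at the derived model instead.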
GiteaMirror added the model label 2026-04-12 19:05:55 -05:00
Author
Owner

@stephanepoinsart commented on GitHub (May 27, 2025):

I'm running into exactly the same problem; I was about to post the same ticket when I found yours, which describes it perfectly.

We know the model runs well at a larger context if it is forced by some Ollama-protocol clients (Open WebUI), but we have some OpenAI-compatible clients that are affected.
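The distinction above can be sketched with the two request shapes involved (an assumption-laden illustration: field names follow Ollama's documented native `/api/chat` body and the OpenAI-compatible `/v1/chat/completions` body; the context value is arbitrary):

```python
# Native Ollama API: an "options" object lets the client raise num_ctx
# per request, overriding the 4096 baked into the modelfile.
native_request = {
    "model": "mistral-small3.1:latest",
    "messages": [{"role": "user", "content": "hello"}],
    "options": {"num_ctx": 32768},  # per-request override
}

# OpenAI-compatible endpoint: the schema has no equivalent field, so
# requests sent this way fall back to the model-defined num_ctx (4096).
openai_request = {
    "model": "mistral-small3.1:latest",
    "messages": [{"role": "user", "content": "hello"}],
}
```

This is why Open WebUI (which speaks the native protocol) can work around the setting while OpenAI-compatible clients cannot.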


Reference: github-starred/ollama#7111