[GH-ISSUE #11409] Add force num_ctx option #69590

Open
opened 2026-05-04 18:35:33 -05:00 by GiteaMirror · 2 comments

Originally created by @xhejtman on GitHub (Jul 14, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11409

Hello,

would it be possible to add an env option so that `num_ctx` from the request is ignored and always set to the value of `OLLAMA_CONTEXT_LENGTH`?

The rationale behind this is that for a shared Ollama instance, it's unfortunate if users cause a runner restart by setting an incorrect `num_ctx` value (even one smaller than the current value).

GiteaMirror added the feature request label 2026-05-04 18:35:33 -05:00

@rick-github commented on GitHub (Jul 14, 2025):

As a work-around, you could have the clients use the OpenAI API [endpoint](https://github.com/ollama/ollama/blob/main/docs/openai.md), or use a [proxy](https://github.com/ollama/ollama/issues/11002#issuecomment-2959740647) to remove `num_ctx` from the request.


@xhejtman commented on GitHub (Jul 14, 2025):

> As a work-around, you could have the clients use the OpenAI API [endpoint](https://github.com/ollama/ollama/blob/main/docs/openai.md), or use a [proxy](https://github.com/ollama/ollama/issues/11002#issuecomment-2959740647) to remove `num_ctx` from the request.

Indeed, I patched the Ollama sources to ignore `num_ctx`, but it would be nice if this were supported directly by Ollama.

The OpenAI API is not always an option: some tools, such as the `Continue` extension for VS Code, do not work correctly via the OpenAI API but do via the Ollama API.


Reference: github-starred/ollama#69590