From c980e1999571a0887b28d837ea81a5938a93c94d Mon Sep 17 00:00:00 2001
From: Maternion <98753158+maternion@users.noreply.github.com>
Date: Mon, 9 Feb 2026 00:29:25 +0530
Subject: [PATCH] Fix formatting of context length notes in documentation

---
 docs/context-length.mdx | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/context-length.mdx b/docs/context-length.mdx
index 679137168..9b1cd8ae8 100644
--- a/docs/context-length.mdx
+++ b/docs/context-length.mdx
@@ -6,9 +6,9 @@ Context length is the maximum number of tokens that the model has access to in m
 
 Ollama defaults to the following context lengths based on VRAM:
 
-< 24 GiB VRAM: 4,096 context
-24-48 GiB VRAM: 32,768 context
->= 48 GiB VRAM: 262,144 context
+ - < 24 GiB VRAM: 4,096 context
+ - 24-48 GiB VRAM: 32,768 context
+ - >= 48 GiB VRAM: 262,144 context
 
 Tasks which require large context like web search, agents, and coding tools should be set to at least 64000 tokens.
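
For reference, the patched docs recommend at least 64,000 tokens of context for web search, agents, and coding tools. A minimal sketch of applying that recommendation, assuming the `OLLAMA_CONTEXT_LENGTH` environment variable and the `num_ctx` request option described elsewhere in Ollama's documentation (the model name in the commented request is illustrative):

```shell
# Raise the server-wide default context length to 64K tokens; this must be
# set in the environment of the `ollama serve` process and overrides the
# VRAM-based defaults listed in the patch above.
export OLLAMA_CONTEXT_LENGTH=65536

# A single request can also override the context length via the num_ctx
# option in the API, e.g.:
# curl http://localhost:11434/api/generate -d '{"model": "llama3.2", "options": {"num_ctx": 65536}}'
```

Note that larger context lengths increase VRAM usage, which is why the defaults above scale with available VRAM.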