diff --git a/docs/context-length.mdx b/docs/context-length.mdx
index 679137168..9b1cd8ae8 100644
--- a/docs/context-length.mdx
+++ b/docs/context-length.mdx
@@ -6,9 +6,9 @@ Context length is the maximum number of tokens that the model has access to in m
 
 Ollama defaults to the following context lengths based on VRAM:
 
-< 24 GiB VRAM: 4,096 context
-24-48 GiB VRAM: 32,768 context
->= 48 GiB VRAM: 262,144 context
+ - < 24 GiB VRAM: 4,096 context
+ - 24-48 GiB VRAM: 32,768 context
+ - >= 48 GiB VRAM: 262,144 context
 
 Tasks which require large context like web search, agents, and coding tools should be set to at least 64000 tokens.
 
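
The guidance above recommends at least 64,000 tokens for large-context workloads. As a minimal sketch of how a user might apply that, assuming a locally running Ollama server and the `OLLAMA_CONTEXT_LENGTH` environment variable and `num_ctx` request option described in Ollama's own documentation (the model name `llama3.1` is only an example):

```shell
# Raise the server-wide default context length before starting Ollama
# (overrides the VRAM-based defaults listed above).
OLLAMA_CONTEXT_LENGTH=65536 ollama serve

# Alternatively, set the context length per request via the API's
# num_ctx option, leaving the server default untouched.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1",
  "prompt": "Summarize this repository.",
  "options": { "num_ctx": 65536 }
}'
```

Either approach trades VRAM for context: a larger `num_ctx` grows the KV cache, so it may force more layers off the GPU on smaller cards.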