[GH-ISSUE #735] What is the supported context length? llama2-chinese:13b-chat-q6_K #344

Closed
opened 2026-04-12 09:55:20 -05:00 by GiteaMirror · 2 comments

Originally created by @Friedrich-hue on GitHub (Oct 8, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/735


@konstantin1722 commented on GitHub (Oct 11, 2023):

@Friedrich-hue, take a look at the `num_ctx` parameter; maybe that is what you are looking for?
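For reference, `num_ctx` can be set in a Modelfile when creating a custom model variant. A minimal sketch (the value 4096 is an illustrative choice, not a recommendation from this thread):

```
# Hypothetical Modelfile: derive from the model in question
FROM llama2-chinese:13b-chat-q6_K
# Set the context window size in tokens (example value)
PARAMETER num_ctx 4096
```

You would then build it with something like `ollama create mymodel -f Modelfile`.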


@jmorganca commented on GitHub (Oct 30, 2023):

Thanks for the issue! The context is provided as an option, and the models in https://ollama.ai/library have default context sizes set based on the model. Note that models will consume significantly more memory if a large prompt is provided.
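As a sketch of passing the context size as an option per request, the Ollama REST API accepts an `options` object in the generate request body (the prompt and value here are illustrative):

```json
{
  "model": "llama2-chinese:13b-chat-q6_K",
  "prompt": "你好，请介绍一下你自己。",
  "options": {
    "num_ctx": 4096
  }
}
```

POSTing this payload to `/api/generate` overrides the model's default context size for that request only.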


Reference: github-starred/ollama#344