[GH-ISSUE #2652] Does ctransformers support ollama models? #63612

Closed
opened 2026-05-03 14:28:17 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @PriyaranjanMarathe on GitHub (Feb 21, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2652

Does ctransformers support ollama models?

How do I specify the model in this code below?

llm = CTransformers(model="***where is the model file for an ollama model?",
                    model_type="llama",
                    max_new_tokens=512)

https://github.com/marella/ctransformers/issues/204
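[Mirror note] ctransformers loads GGUF/GGML model files from a local path; it does not talk to the ollama registry. So the question reduces to locating the weights file among ollama's content-addressed blobs. One community heuristic (an assumption about ollama's on-disk layout, not a supported API) is that the model weights are by far the largest blob, since the other blobs hold small JSON metadata:

```python
from pathlib import Path

def largest_blob(blobs_dir: str) -> Path:
    """Return the biggest file in ollama's blobs directory.

    Heuristic sketch: the GGUF weights are typically the largest blob by
    a wide margin. This assumes ollama's on-disk blob layout; it is not
    a supported ollama API.
    """
    blobs = [p for p in Path(blobs_dir).iterdir() if p.is_file()]
    if not blobs:
        raise FileNotFoundError(f"no blobs found in {blobs_dir}")
    return max(blobs, key=lambda p: p.stat().st_size)
```

The returned path could then be passed as the `model=` argument to `CTransformers`, e.g. `CTransformers(model=str(largest_blob("/usr/share/ollama/.ollama/models/blobs")), model_type="llama")`. This breaks down if several large models are pulled, in which case parsing the manifest (see below in the thread) is more reliable.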

Author
Owner

@PriyaranjanMarathe commented on GitHub (Feb 21, 2024):

Was able to make it work with ollama. Used the following code sample.

from langchain_community.llms import CTransformers  # exposed as langchain.llms.CTransformers in older LangChain versions

llm = CTransformers(
    model="/usr/share/ollama/.ollama/models/blobs/sha256:8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246",
    model_type="llama2",
    max_new_tokens=512,
    temperature=0.1,
)

The model blob is stored under the /usr/share/ollama/.ollama/models/blobs directory on Linux.
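[Mirror note] Rather than copying an opaque sha256 filename by hand, the blob for a given model can be resolved from ollama's manifest. A sketch, assuming the manifest layout ollama uses on Linux (`<models_dir>/manifests/registry.ollama.ai/library/<name>/<tag>`, where the layer with mediaType `application/vnd.ollama.image.model` points at the weights blob); the tag default and the registry path are assumptions that may vary across ollama versions:

```python
import json
from pathlib import Path

def model_blob_path(models_dir: str, name: str, tag: str = "latest") -> Path:
    """Resolve the GGUF weights blob for a pulled ollama model.

    Reads the model's manifest and returns the path of the layer whose
    mediaType marks it as the model weights. Assumes the on-disk layout
    used by ollama on Linux; not a supported ollama API.
    """
    root = Path(models_dir)
    manifest_file = root / "manifests" / "registry.ollama.ai" / "library" / name / tag
    manifest = json.loads(manifest_file.read_text())
    digest = next(
        layer["digest"]
        for layer in manifest["layers"]
        if layer["mediaType"] == "application/vnd.ollama.image.model"
    )
    # Blob filenames use either 'sha256:<hex>' or 'sha256-<hex>'
    # depending on the ollama version, so try both separators.
    for sep in (":", "-"):
        candidate = root / "blobs" / digest.replace(":", sep, 1)
        if candidate.exists():
            return candidate
    raise FileNotFoundError(f"blob for digest {digest} not found")
```

The result could then be passed to `CTransformers(model=str(model_blob_path("/usr/share/ollama/.ollama/models", "llama2")), model_type="llama")`.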


Reference: github-starred/ollama#63612