[GH-ISSUE #5383] Referring offline downloaded models in code #29124

Closed
opened 2026-04-22 07:47:17 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @RaoPisay on GitHub (Jun 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5383

What is the issue?

Need help: I am trying to refer to a model downloaded from Ollama. I know the path where it is downloaded: `~/.ollama/models/*`.

In the Python code:

`tokenizer = AutoTokenizer.from_pretrained(model)`

Here I want to set the `model` variable to the path of a downloaded model. For example, I've downloaded two models, `llama3` and `gemma:2b`. When I navigate to `~/.ollama/models` I see two folders, `blobs` and `manifests`. In the `blobs` folder I see the large files: a 4.7 GB one corresponding to `llama3` and a 1.7 GB one corresponding to `gemma:2b`. I tried both file paths in place of the `model` variable, but no luck.

Below is a screenshot of the path, just for reference:
![image](https://github.com/ollama/ollama/assets/8242864/d4fb1c14-b592-409f-8ad6-e65d31c7a9e4)

Has anyone tried to refer to Ollama-downloaded offline models in their code, like in the Python statement above?
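A minimal sketch for locating those blob files from Python (assuming the default `~/.ollama/models` store; if the `OLLAMA_MODELS` environment variable is set, the directory will be elsewhere, and the blob names are content hashes rather than model names):

```python
from pathlib import Path

# Assumed default Ollama model store; adjust if OLLAMA_MODELS points elsewhere.
blob_dir = Path.home() / ".ollama" / "models" / "blobs"

# List blob files with their sizes to see which blob belongs to which model
# (e.g. the ~4.7 GB blob for llama3, the ~1.7 GB blob for gemma:2b).
for blob in sorted(blob_dir.iterdir(), key=lambda p: p.stat().st_size, reverse=True):
    print(f"{blob.stat().st_size / 1e9:6.2f} GB  {blob.name}")
```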

OS

Windows

GPU

Nvidia

CPU

AMD

Ollama version

0.1.48

GiteaMirror added the bug label 2026-04-22 07:47:17 -05:00
Author
Owner

@rick-github commented on GitHub (Jun 29, 2024):

The files that ollama (actually llama.cpp) uses are in GGUF format, which is different from the format of the files used by transformers. There is [some support](https://huggingface.co/docs/transformers/v4.42.0/gguf) for GGUF in transformers, and you need to specify that it's a GGUF file with `gguf_file`. Note that it only supports a few architectures, so your downloaded models may not work and you might have to try a different one.
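As an illustration, a minimal sketch of the `gguf_file` route. The blob name below is a placeholder, and copying the Ollama blob to a `.gguf` file first is an assumption made to keep the example simple; it will only work for architectures covered by transformers' GGUF support (see the docs linked above):

```python
import shutil
from pathlib import Path

from transformers import AutoTokenizer, AutoModelForCausalLM

# Placeholder blob name; pick the actual sha256-... file for your model
# from ~/.ollama/models/blobs (e.g. the ~1.7 GB blob for gemma:2b).
blob = Path.home() / ".ollama" / "models" / "blobs" / "sha256-<digest-of-your-model>"

# Copy the blob into a working directory under a .gguf name so transformers
# can treat it as a local GGUF checkpoint.
work_dir = Path("gguf_model")
work_dir.mkdir(exist_ok=True)
gguf_name = "model.gguf"
shutil.copy(blob, work_dir / gguf_name)

# Recent transformers versions can dequantize and load supported GGUF
# architectures when told which file is the GGUF checkpoint.
tokenizer = AutoTokenizer.from_pretrained(work_dir, gguf_file=gguf_name)
model = AutoModelForCausalLM.from_pretrained(work_dir, gguf_file=gguf_name)
```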

Author
Owner

@RaoPisay commented on GitHub (Jul 1, 2024):

Thank you for the clarification; I had been searching extensively online to find a method that aligned with my needs.

I appreciate that Ollama downloads the entire LLM model, although I've encountered challenges in utilizing it as intended. Perhaps exploring enhancements could further optimize its usability.

Reference: github-starred/ollama#29124