[GH-ISSUE #5195] How to import a model (.bin) from Hugging Face? #3264

Closed
opened 2026-04-12 13:48:38 -05:00 by GiteaMirror · 6 comments

Originally created by @javierxio on GitHub (Jun 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5195

Hello. I would like to use a model from Hugging Face. I was able to download a file called `pytorch_model.bin`, which I presume is the LLM. I created a directory and a `Modelfile.txt` file with the following contents:

FROM C:\ollama_models\florence-2-base\pytorch_model.bin

Running the `ollama create` command results in the following errors:

C:\ollama_models\florence-2-base>ollama create florence2:base -f ./Modelfile.txt
transferring model data
unpacking model metadata
Error: open C:\Users\javie\.ollama\models\blobs\1075676817\pytorch_model\data.pkl: The system cannot find the path specified.

Please help me understand. I am new to this. Thanks!

GiteaMirror added the model label 2026-04-12 13:48:38 -05:00

@bmabir17 commented on GitHub (Jun 22, 2024):

I think Ollama does not support `.bin` model files. AFAIK it now supports GGUF only.
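As a hedged sketch of the conversion step this implies: llama.cpp ships a script that converts a Hugging Face checkpoint directory into a GGUF file, which Ollama can then import. The script name and flags have varied across llama.cpp versions, and Florence-2's architecture may well not be supported by the converter at all, so treat this as illustrative only:

```shell
# Illustrative only: assumes the model's architecture is supported by
# llama.cpp's converter (Florence-2 may not be), and that the script is
# named convert_hf_to_gguf.py in the checked-out version.
git clone https://github.com/ggerganov/llama.cpp
pip install -r llama.cpp/requirements.txt

# Convert the downloaded Hugging Face directory into a single GGUF file
python llama.cpp/convert_hf_to_gguf.py C:\ollama_models\florence-2-base ^
    --outfile florence-2-base.gguf
```

If the conversion succeeds, the Modelfile would then point at the resulting file, e.g. `FROM ./florence-2-base.gguf`.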


@mili-tan commented on GitHub (Jun 23, 2024):

Try pointing the Modelfile at the Hugging Face model directory instead of the bin file. Note that this is only supported for some architectures.
FROM C:\ollama_models\florence-2-base\

https://github.com/ollama/ollama/blob/main/docs/import.md#automatic-quantization
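For reference, the directory-import flow from the linked docs looks roughly like this. Hedged: only a limited set of architectures can be imported from a raw checkpoint directory, and the `--quantize` flag is taken from the import docs of that era, so verify against your installed version:

```shell
# Modelfile points at the checkpoint directory, not a single .bin file
echo FROM C:\ollama_models\florence-2-base\ > Modelfile

# Create the model; --quantize converts FP16 weights on the fly
# (per the "automatic quantization" section linked above)
ollama create florence2:base -f Modelfile --quantize q4_K_M
```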


@javierxio commented on GitHub (Jun 24, 2024):

@mili-tan thanks. That seems to help, and now I am getting to this stage:

C:\ollama_models\florence-2-base>ollama create florence2:base -f ./Modelfile.txt
transferring model data
unpacking model metadata
Error: open C:\Users\javie\.ollama\models\blobs\613013707\params.json: The system cannot find the file specified.

It seems I am now missing some other files (e.g. `params.json`). I will check whether they are available in the Hugging Face repo.


@javierxio commented on GitHub (Jun 24, 2024):

These are the files in the Hugging Face repo. I don't know which one corresponds to `params.json`, but I will try renaming one of the files to that filename.

![image](https://github.com/ollama/ollama/assets/63758477/60117ad9-52d9-44df-a8ba-45a17000a46b)


@javierxio commented on GitHub (Jun 24, 2024):

Ok. I downloaded the `config.json` file and renamed it to `params.json`:

C:\ollama_models\florence-2-base>ollama create florence2:base -f ./Modelfile.txt
transferring model data
unpacking model metadata
processing tensors

It seems it went through, but now the `run` command gives an error:

C:\ollama_models\florence-2-base>ollama run florence2:base
pulling manifest
Error: pull model manifest: file does not exist
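That error usually means the name is not in the local model store, so Ollama falls back to pulling it from the registry and fails. A hedged sanity check, using standard Ollama CLI commands:

```shell
# List locally registered models; if florence2:base is absent,
# the earlier `ollama create` did not actually complete
ollama list

# Re-run create and watch for an error after "processing tensors"
ollama create florence2:base -f ./Modelfile.txt
```

Renaming `config.json` to `params.json` likely let `create` get further without producing a valid model, since the two files have different schemas.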

@javierxio commented on GitHub (Jun 30, 2024):

I have moved to llama-cpp-python in the meantime. Thanks.
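For completeness, a minimal hedged sketch of the llama-cpp-python route. It assumes a GGUF file is already available (the filename below is hypothetical); llama-cpp-python, like Ollama, cannot load a raw `pytorch_model.bin`:

```shell
# The [server] extra provides an OpenAI-compatible HTTP server
pip install "llama-cpp-python[server]"

# Serve a local GGUF model (path is a placeholder)
python -m llama_cpp.server --model ./florence-2-base.gguf --port 8000
```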

Reference: github-starred/ollama#3264