[GH-ISSUE #7327] ollama create Error: open config.json: file does not exist #66710

Closed
opened 2026-05-04 07:55:26 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @dragoncdj on GitHub (Oct 23, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7327

What is the issue?

I use the create command:

ollama create mymodel2 -f D:\AI\qwen7\Modelfile

but it returns:

Error: open config.json: file does not exist

This is my Modelfile:

FROM .\export\pytorch_model.bin
PARAMETER stop <|eot|>
PARAMETER top_p 0.9
PARAMETER temperature 1.0

However, there is a config.json file in the same folder as the pytorch_model.bin file. I don't know what the cause is or how to solve it.
Screenshots (微信图片_20241023092936, 微信图片_202410230929326):
https://github.com/user-attachments/assets/f62c6cdc-64c5-4e27-937e-1272e579a80d
https://github.com/user-attachments/assets/11959da6-42e1-4124-8869-ad8824da39e8
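
For context, based on the description above (not verified against the reporter's machine), the folder layout is presumably something like:

D:\AI\qwen7\
    Modelfile
    export\
        config.json
        pytorch_model.bin
        (tokenizer and other checkpoint files)

With FROM pointing at pytorch_model.bin alone, Ollama only reads that single file and never sees the config.json sitting next to it, which matches the error above.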

OS

Windows

GPU

No response

CPU

Intel

Ollama version

0.3.13

GiteaMirror added the bug label 2026-05-04 07:55:26 -05:00
Author
Owner

@rick-github commented on GitHub (Oct 23, 2024):

Try specifying the directory that the BIN file is in:

FROM .\export
PARAMETER stop <|eot|>
PARAMETER top_p 0.9
PARAMETER temperature 1.0
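
For completeness, a sketch of re-running the commands with the updated Modelfile (same paths as in the original report; the run step is just to verify the model loads):

ollama create mymodel2 -f D:\AI\qwen7\Modelfile
ollama run mymodel2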
Author
Owner

@pdevine commented on GitHub (Nov 13, 2024):

@rick-github has the correct answer here. Use FROM path/to/model/dir. If you specify FROM file.bin, Ollama will only pick up the single bin file you specified. Also, we don't officially support converting from pytorch models, only safetensors, so converting from the pytorch weights may or may not work in the future. The problem is that there often isn't enough information to correctly convert the model.
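
As a hedged illustration of the officially supported path (the directory name and file list below are assumptions, not taken from this issue), a safetensors checkpoint directory typically looks something like:

export-safetensors\
    config.json
    model.safetensors    (or sharded model-0000X-of-0000N.safetensors files plus an index)
    tokenizer.json
    tokenizer_config.json

and the Modelfile would point FROM at that directory rather than at any individual weight file.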

Author
Owner

@fellipgomes commented on GitHub (Dec 30, 2025):

Many thanks, @artemavrin.

I'm on Windows 11 with Ollama; I renamed adapter_model.safetensors to model.safetensors and it worked.

https://github.com/ollama/ollama/issues/13314
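
A minimal sketch of that rename on Windows (cmd; the adapter directory path is hypothetical, only the file names come from the comment above):

ren D:\AI\qwen7\adapter\adapter_model.safetensors model.safetensors
ollama create mymodel2 -f D:\AI\qwen7\Modelfile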

Reference: github-starred/ollama#66710