[GH-ISSUE #10628] Way to mount locally downloaded models from hugging face to ollama docker container #6992

Closed
opened 2026-04-12 18:53:10 -05:00 by GiteaMirror · 3 comments

Originally created by @Vijaygawate on GitHub (May 9, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10628

Hello,

I have a directory called models on my host, which has 15 models downloaded from Hugging Face. How can I mount those models inside the Ollama Docker container so that I can use those instead of pulling and running models after we create the Ollama Docker container?

GiteaMirror added the feature request label 2026-04-12 18:53:10 -05:00

@rick-github commented on GitHub (May 9, 2025):

ollama uses a different naming scheme than HF. You will have to create the ollama model hierarchy, link the HF GGUF files, create the ollama template and params files, and then mount it into the Docker container with a `volumes` stanza.

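A minimal sketch of the simpler supported route: instead of hand-building the ollama model hierarchy, mount the host directory into the container and register each GGUF with `ollama create` and a Modelfile. The paths (`/path/to/hf-models`) and model filename (`my-llama.Q4_K_M.gguf`) below are placeholders for your own setup, not values from this thread:

```shell
# Start the container with the host's Hugging Face model directory mounted
# read-only, plus a named volume for ollama's own model store.
docker run -d --name ollama \
  -v /path/to/hf-models:/hf-models:ro \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Write a Modelfile pointing at one of the mounted GGUF files.
# It goes into the mounted directory so the container can see it.
cat > /path/to/hf-models/Modelfile <<'EOF'
FROM /hf-models/my-llama.Q4_K_M.gguf
EOF

# Register the GGUF under a name of your choice, then try it.
docker exec ollama ollama create my-llama -f /hf-models/Modelfile
docker exec ollama ollama run my-llama "hello"
```

Note that `ollama create` copies the weights into ollama's own blob store under `/root/.ollama`, so the read-only mount is only needed at import time; the template and params can also be added to the Modelfile (`TEMPLATE`, `PARAMETER` directives) rather than created by hand.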

@Vijaygawate commented on GitHub (May 9, 2025):

@rick-github Thank you for the reply.
Do you have any working document for the above use case, or any blog that I can refer to?


@rick-github commented on GitHub (May 9, 2025):

[Here](https://github.com/ollama/ollama/issues/8466#issuecomment-2597731833) is an example of taking the ollama model hierarchy and turning it into a format suitable for inference engines that use the HF model formats, so the reverse of what you want to do, but it shows the basic mechanism.

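For the direction the question asks about, importing all 15 models can be scripted. This is a hedged sketch, assuming a running container named `ollama` with the host directory mounted at `/hf-models` (as in the earlier comment's `volumes` setup), and that each model is a single-file GGUF; the host path is a placeholder:

```shell
# Import every GGUF in the mounted directory as an ollama model,
# deriving each model's name from its filename.
for gguf in /path/to/hf-models/*.gguf; do
  file=$(basename "$gguf")
  # ollama model names are lowercase; strip the .gguf extension too.
  name=$(printf '%s' "${file%.gguf}" | tr '[:upper:]' '[:lower:]')
  # Regenerate the Modelfile for this model inside the mounted directory.
  printf 'FROM /hf-models/%s\n' "$file" > /path/to/hf-models/Modelfile
  docker exec ollama ollama create "$name" -f /hf-models/Modelfile
done

# Verify the imported models are registered.
docker exec ollama ollama list
```

Multi-part GGUF files or models needing a custom chat template would still require editing the Modelfile per model, as described above.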

Reference: github-starred/ollama#6992