[GH-ISSUE #13760] Support importing local GGUF models from llama.cpp / LM Studio #34779

Closed
opened 2026-04-22 18:36:57 -05:00 by GiteaMirror · 4 comments

Originally created by @chyyl on GitHub (Jan 17, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/13760

Many users already have GGUF models (e.g., from TheBloke) used in llama.cpp or LM Studio. Currently, Ollama requires re-downloading and repackaging them via Modelfiles—even though it’s built on the same foundation.

Please add a way to import local GGUF files safely, either via:

ollama import --file ./model.gguf --name my-model

or by allowing FROM ./model.gguf in Modelfiles.
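For illustration, the Modelfile variant of the request might look like the sketch below. `FROM`, `TEMPLATE`, and `PARAMETER stop` are real Modelfile directives; the specific template and stop token values are hypothetical placeholders standing in for the metadata the issue mentions:

```
# Illustrative Modelfile: import a local GGUF and attach metadata
FROM ./model.gguf

# Hypothetical metadata (template and stop tokens, as mentioned in the issue)
TEMPLATE """{{ .Prompt }}"""
PARAMETER stop "</s>"
```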

This avoids duplication, improves ecosystem interoperability, and respects Ollama’s metadata model (templates, stop tokens, etc.) without compromising security.

GiteaMirror added the feature request label 2026-04-22 18:36:57 -05:00

@chllei commented on GitHub (Jan 17, 2026):

This functionality has been available in Ollama for a long time. Please refer to the [doc](https://docs.ollama.com/modelfile#build-from-a-gguf-file).


@chllei commented on GitHub (Jan 17, 2026):

However, Ollama currently only supports importing text models in GGUF format. I hope that the Ollama team will soon support importing multimodal GGUF models.


@rick-github commented on GitHub (Jan 17, 2026):

Ollama supports importing vision models in GGUF format; just specify the text and vision files. Other modalities, such as speech and video, require infrastructure changes.


@pdevine commented on GitHub (Jan 20, 2026):

The problem with the vision models (and other modalities) is that someone can change the implementation in llama.cpp to use different KV or tensor names than what our model implementations expect. We try to guess what the optimal names should be, but it's really dependent on the model implementations.

I'm going to go ahead and close the issue since I don't think it's possible to make this a "generic" thing (i.e. have it work 100% of the time for all models), and we have to do it on a case-by-case basis.
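The compatibility concern above hinges on the key/value metadata and tensor names stored inside the GGUF file itself. As a sketch (not Ollama's importer), the fixed GGUF header that precedes that metadata can be read with a few `struct` calls; the KV pairs whose names matter begin immediately after these counts:

```python
# Minimal GGUF header peek, per the GGUF spec:
# 4-byte magic "GGUF", uint32 version, uint64 tensor count, uint64 KV count.
import struct

def read_gguf_header(path):
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":
            raise ValueError("not a GGUF file")
        version, = struct.unpack("<I", f.read(4))    # format version (2 or 3 today)
        n_tensors, = struct.unpack("<Q", f.read(8))  # number of tensor entries
        n_kv, = struct.unpack("<Q", f.read(8))       # number of metadata KV pairs
    return {"version": version, "tensors": n_tensors, "kv_pairs": n_kv}
```

An importer would then walk the `n_kv` metadata pairs and `n_tensors` tensor infos, which is exactly where differing naming conventions between llama.cpp model ports break a "generic" import.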


Reference: github-starred/ollama#34779