[GH-ISSUE #10558] Error: unsupported architecture "LlavaForConditionalGeneration" #69008

Closed
opened 2026-05-04 16:46:16 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @2201925235 on GitHub (May 4, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10558

This error is displayed when importing a fine-tuned LLaVA model into Ollama.

![Image](https://github.com/user-attachments/assets/891004c0-653f-43e1-9708-0399558e7fa7)

GiteaMirror added the model label 2026-05-04 16:46:17 -05:00
Author
Owner

@rick-github commented on GitHub (May 4, 2025):

The ollama import function only supports a subset of architectures. For unsupported models, you can use [llama.cpp](https://github.com/ggml-org/llama.cpp/blob/master/convert_hf_to_gguf_update.py) to convert the model to GGUF and then import that.
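The workflow described above might look roughly like this. This is a sketch, not a verified recipe: it assumes a fine-tuned checkpoint in a hypothetical `./my-llava` directory, and script names and flags can vary across llama.cpp versions. Note that LLaVA's multimodal projector may need separate conversion steps beyond what is shown here.

```shell
# Get llama.cpp and its conversion-script dependencies.
git clone https://github.com/ggml-org/llama.cpp
pip install -r llama.cpp/requirements.txt

# Convert the Hugging Face checkpoint to GGUF.
# (./my-llava is a placeholder for the fine-tuned model directory.)
python llama.cpp/convert_hf_to_gguf.py ./my-llava --outfile my-llava.gguf

# Point a Modelfile at the resulting GGUF and import it into Ollama.
echo 'FROM ./my-llava.gguf' > Modelfile
ollama create my-llava -f Modelfile
```

Importing the GGUF this way bypasses Ollama's safetensors importer, so the architecture check that produced the original error is not involved.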


Reference: github-starred/ollama#69008