[GH-ISSUE #7946] this model is not supported by your version of Ollama. You may need to upgrade #30847

Closed
opened 2026-04-22 10:47:32 -05:00 by GiteaMirror · 3 comments
Originally created by @s313627345 on GitHub (Dec 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7946

What is the issue?

When loading a GGUF model, Ollama fails with the error "this model is not supported by your version of Ollama. You may need to upgrade". Can anyone help?

OS

Windows

GPU

AMD

CPU

Intel

Ollama version

0.4.7

GiteaMirror added the bug label 2026-04-22 10:47:33 -05:00
@s313627345 commented on GitHub (Dec 5, 2024):

![image](https://github.com/user-attachments/assets/88d32134-7cac-4555-8c08-05c832b34ebb)

@rick-github commented on GitHub (Dec 5, 2024):

Where did you get the model from?

@pdevine commented on GitHub (Dec 6, 2024):

`qwen2-vl` isn't supported just yet. I'll go ahead and close this issue since there's already #6564.
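For context, this error typically means the GGUF file declares a format version or model architecture the installed Ollama build doesn't recognize. As a quick local sanity check (this is an illustrative sketch, not Ollama's actual validation code), the magic bytes and format version can be read straight from the file header:

```python
import struct

def gguf_header_version(path):
    """Return the GGUF format version stored in a file's header.

    A GGUF file begins with the 4-byte magic b"GGUF" followed by a
    little-endian uint32 format version. A loader that only understands
    older versions (or older architectures listed later in the metadata)
    will reject the file.
    """
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":
            raise ValueError(f"not a GGUF file (magic={magic!r})")
        (version,) = struct.unpack("<I", f.read(4))
    return version
```

If the reported version is newer than the installed Ollama understands, upgrading is the fix; if the architecture is new (as with `qwen2-vl` here), an Ollama release that adds support for that architecture is needed.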

Reference: github-starred/ollama#30847