[GH-ISSUE #4015] Add support for Qwen-VL #49001

Closed
opened 2026-04-28 10:35:28 -05:00 by GiteaMirror · 2 comments
Originally created by @dagehuifei on GitHub (Apr 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4015

https://huggingface.co/Qwen/Qwen-VL
GiteaMirror added the feature request label 2026-04-28 10:35:28 -05:00

@thinkverse commented on GitHub (Apr 29, 2024):

That model cannot be added currently; llama.cpp doesn't support Qwen-VL (https://github.com/ggerganov/llama.cpp/issues/5331), and there doesn't seem to be much traction on getting it supported either.


@rick-github commented on GitHub (Jul 15, 2025):

https://ollama.com/library/qwen2.5vl


Reference: github-starred/ollama#49001