[GH-ISSUE #12092] Ollama doesn't support InternVL3_5 14B-Q6_K_L gguf #8034

Closed
opened 2026-04-12 20:16:47 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @Elgokoo on GitHub (Aug 26, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12092

What is the issue?

Hi,

I have downloaded this new model: InternVL3_5 14B-Q6_K_L GGUF, but its vision capabilities don't work, while Gemma 3's do. I'm using Open WebUI, but I'm sure the problem doesn't come from it.

Link: https://huggingface.co/bartowski/OpenGVLab_InternVL3_5-14B-GGUF/blob/main/OpenGVLab_InternVL3_5-14B-Q6_K_L.gguf
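One way to rule Open WebUI in or out is to send an image prompt to Ollama's HTTP API directly. A minimal sketch of building such a request follows; the model tag `internvl3.5` is hypothetical and should be replaced with whatever tag the GGUF was imported under:

```python
# Sketch: build the JSON body for Ollama's /api/generate endpoint with an
# image attached, bypassing Open WebUI entirely. Ollama expects images as
# base64-encoded strings in the "images" array.
import base64
import json

def build_vision_request(model: str, prompt: str, image_bytes: bytes) -> str:
    """Return the JSON request body for an image prompt."""
    payload = {
        "model": model,                # hypothetical tag; use your own
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,
    }
    return json.dumps(payload)

# POST the resulting body to http://localhost:11434/api/generate
# (e.g. with curl or urllib) and compare against Open WebUI's behavior.
body = build_vision_request("internvl3.5", "Describe this image.", b"\x89PNG...")
```

If the direct API call also ignores the image, the problem is in Ollama's handling of the model rather than in Open WebUI.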

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-12 20:16:47 -05:00

Reference: github-starred/ollama#8034