[GH-ISSUE #13211] Windows GUI - Version 0.13.0: some models (gemma3:27b) don't accept images #34494

Open
opened 2026-04-22 18:06:42 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @msvobodnikov on GitHub (Nov 23, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13211

Originally assigned to: @hoyyeva on GitHub.

What is the issue?

Problem with the Windows GUI. With the gemma3:27b model, if I try to add an image to a message, I get the error: "This model does not support images". If I switch the model to something else, like qwen3-vl, paste the image, then switch back to gemma3, it works. So the model is definitely OK (and I was using it in previous versions), but after the last update something is broken. The model works with images if I use the CLI.

Relevant log output


OS

Windows

GPU

Nvidia

CPU

AMD

Ollama version

0.13.0

GiteaMirror added the app, bug labels 2026-04-22 18:06:42 -05:00

@kiril1976 commented on GitHub (Nov 29, 2025):

I use a MacBook with macOS Tahoe 26, and all models give that error.

<!-- gh-comment-id:3591492248 -->

@boringbyte commented on GitHub (Dec 2, 2025):

I encountered the same issue with the gemma3 4b and 12b models, but not with the 27b parameter model. I'm using Ollama v0.13.0 on Windows.

<!-- gh-comment-id:3602912222 -->

@DiffuzionDreamer commented on GitHub (Dec 6, 2025):

I had a similar issue with qwen3vl.

The workaround is to run `ollama serve` in a separate CLI window and restart the client.

see https://github.com/ollama/ollama/issues/12851#issuecomment-3553913226

> Hello everyone! We are currently working on a fix. In the meantime, as a temporary solution, you can try running `ollama serve` in a different terminal and then restarting the app; it should work. We are working on a more robust solution. Sorry for the inconvenience, and we will update here once the fix is ready!
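The workaround quoted above can be sketched as the following terminal steps. `ollama serve` is the real subcommand named in the comment; the surrounding steps (quitting the tray app first, relaunching it afterwards) are paraphrased from this thread, not an official procedure:

```shell
# Workaround sketch (assumptions: Windows, ollama on PATH).
# 1. Quit the Ollama tray/GUI app so its bundled server stops.
# 2. Start the server manually in its own terminal window and leave it running:
ollama serve
# 3. Relaunch the Ollama app; it should connect to this server,
#    and image attachments should be accepted again.
```

Closing the terminal stops the manually started server, so keep it open for the whole session.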

<!-- gh-comment-id:3619798946 -->
Reference: github-starred/ollama#34494