[GH-ISSUE #11443] Regression - unsupported quantization type Q3_K_S. #54069

Closed
opened 2026-04-29 05:10:45 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @mirage335 on GitHub (Jul 16, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11443

What is the issue?

This used to work, and was part of getting Mistral vision models to run on laptops with 16GB of VRAM.

https://github.com/ollama/ollama/issues/10393

Unfortunately:

```bash
ollama create -q q3_k_s mistral-small3.2:24b-instruct-2506-virtuoso -f Modelfile
```

```
Error: unsupported quantization type Q3_K_S - supported types are F32, F16, Q4_K_S, Q4_K_M, Q8_0
```
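Until Q3_K_S support is restored, a possible workaround is to quantize with one of the types the error message itself lists as supported. This is a sketch only; the target model name below is illustrative, not a verified tag, and Q4_K_S will produce a somewhat larger model than Q3_K_S:

```shell
# Workaround sketch: pick a type from the "supported types" list in the error
# (F32, F16, Q4_K_S, Q4_K_M, Q8_0). Q4_K_S is the smallest of these.
# "mistral-small3.2-q4" is a hypothetical target name for illustration.
ollama create -q q4_K_S mistral-small3.2-q4 -f Modelfile
```

Note that a Q4_K_S quantization of a 24B model may still exceed a 16GB VRAM budget, which is presumably why Q3_K_S was being used in the first place (see issue #10393 linked above).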

Very unhelpful!

Relevant log output


OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

ollama version is 0.9.7-rc0

GiteaMirror added the bug label 2026-04-29 05:10:45 -05:00
Author
Owner

@rick-github commented on GitHub (Jul 21, 2025):

https://github.com/ollama/ollama/pull/10647#issuecomment-2873563847


Reference: github-starred/ollama#54069