[GH-ISSUE #10461] unable to load model #32639

Closed
opened 2026-04-22 14:16:37 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @skytraveler on GitHub (Apr 29, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10461

What is the issue?

I downloaded the qwen3:32b model, but it fails to load:
unable to load model: /Users/sky/.ollama/models/blobs/sha256-3291abe70f16ee9682de7bfae08db5373ea9d6497e614aaad63340ad421d6312

Ollama is supposed to support qwen3, am I right?

Relevant log output


OS

macOS

GPU

Apple

CPU

Apple

Ollama version

ollama version is 0.6.0

GiteaMirror added the bug label 2026-04-22 14:16:37 -05:00

@Ol1ver0413 commented on GitHub (Apr 29, 2025):

I have the same problem. My ollama version is 0.6.4. I may try a newer ollama version.


@skytraveler commented on GitHub (Apr 29, 2025):

My hardware is an M4 Pro chip with 64 GB of memory.


@litetoooooom commented on GitHub (Apr 29, 2025):

+1


@skytraveler commented on GitHub (Apr 29, 2025):

Problem solved after updating ollama to 0.6.6:
ollama version is 0.6.6


@skytraveler commented on GitHub (Apr 29, 2025):

> +1

Update your ollama.
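For anyone hitting this on an older release, here is a minimal shell sketch of the check the thread implies: compare the installed version against 0.6.6, the release this thread reports as loading qwen3 successfully. It assumes `ollama -v` prints a line ending in the version number (as shown in the issue body); the hard-coded `installed` value is just a stand-in for that command's output.

```shell
# Minimum version reported in this thread to load qwen3
required="0.6.6"

# In practice, capture this from the CLI instead:
#   installed=$(ollama -v | awk '{print $NF}')
installed="0.6.0"

# sort -V orders version strings numerically; if the smaller of the
# two is the required version, the installed one is new enough.
lowest=$(printf '%s\n%s\n' "$required" "$installed" | sort -V | head -n1)
if [ "$lowest" = "$required" ]; then
  echo "ollama $installed is new enough for qwen3"
else
  echo "ollama $installed is too old; update to >= $required"
fi
```

With `installed="0.6.0"` this prints the "too old" branch; swap in the real `ollama -v` output to check your own install.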


Reference: github-starred/ollama#32639