[GH-ISSUE #11109] Please support official Qwen3 MLX version for MacOS #33089

Closed
opened 2026-04-22 15:20:25 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @mzh1996 on GitHub (Jun 18, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11109

Recently, the Qwen3 MLX models were released.

![Image](https://github.com/user-attachments/assets/33a41524-4a63-464f-808f-eae04979694a)

So, can I use such models (e.g. Qwen3-32B-MLX-bf16) in ollama on my MacBook Pro?

Will ollama automatically download the official Qwen3 MLX models if it is run on devices with M-series chips (M1-M4)?

GiteaMirror added the model label 2026-04-22 15:20:25 -05:00
Author
Owner

@rick-github commented on GitHub (Jun 18, 2025):

> So, can I use such models (e.g. Qwen3-32B-MLX-bf16) in ollama on my Macbook Pro?

No, ollama doesn't currently support MLX. https://github.com/ollama/ollama/issues/1730

Author
Owner

@mzh1996 commented on GitHub (Jun 18, 2025):

> > So, can I use such models (e.g. Qwen3-32B-MLX-bf16) in ollama on my Macbook Pro?
>
> No, ollama doesn't currently support MLX. [#1730](https://github.com/ollama/ollama/issues/1730)

Thank you for your quick response!

Looking forward to MLX support in this project in the future.


Reference: github-starred/ollama#33089