[GH-ISSUE #14476] need qwen3.5-flash #71448

Closed
opened 2026-05-05 01:45:48 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @pamdla on GitHub (Feb 26, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14476

I found this link https://huggingface.co/Qwen/Qwen3.5-35B-A3B which mentions a Qwen3.5-flash model or version.
Do you plan to add this flash model?

Or have you already developed the qwen3.5-flash model?


@resc863 commented on GitHub (Feb 26, 2026):

I believe Qwen-3.5-Flash is an Alibaba Cloud-only model.


@rick-github commented on GitHub (Feb 27, 2026):

> I believe Qwen-3.5-Flash is Alibaba Cloud only model

This is correct. It's the same model as Qwen3.5-35B-A3B, but with a larger context window and built-in tools.


Reference: github-starred/ollama#71448