[GH-ISSUE #13398] Devstral-2-123B-Instruct-2512 Cloud + Local #70906

Closed
opened 2026-05-04 23:25:59 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @eitelnick on GitHub (Dec 9, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13398

Originally assigned to: @BruceMacD on GitHub.

[Devstral-2-123B](https://huggingface.co/mistralai/Devstral-2-123B-Instruct-2512) is too large for my local rig; it would be nice to see it offered as part of the Cloud options.

GiteaMirror added the model label 2026-05-04 23:25:59 -05:00

@seitzbg commented on GitHub (Dec 9, 2025):

https://huggingface.co/mistralai/Devstral-Small-2-24B-Instruct-2512


@pdevine commented on GitHub (Dec 10, 2025):

We're close!


@maternion commented on GitHub (Dec 10, 2025):

> We're close!

Would love the 24b available locally on ollama!


@pdevine commented on GitHub (Dec 10, 2025):

@maternion That is close too!


@IAteYourBacon commented on GitHub (Dec 12, 2025):

24B should run on a 3090, right?


@pdevine commented on GitHub (Dec 12, 2025):

> 24B should run on a 3090, right?

I think w/ `q4_K_M` it should work?

Going to close this since it's shipped now.
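For a rough sense of why a 24B model at q4_K_M plausibly fits on a 3090's 24 GiB, here is a back-of-envelope estimate. The ~4.8 bits-per-weight figure for q4_K_M and the overhead allowance are assumptions for illustration, not numbers from this thread:

```python
def model_vram_gib(params_billion: float, bits_per_weight: float = 4.8) -> float:
    """Approximate weight memory in GiB for a quantized model.

    q4_K_M mixes 4- and 6-bit blocks, so its effective size is a bit
    above 4 bits/weight; ~4.8 is a commonly cited ballpark.
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

weights = model_vram_gib(24)   # ~13.4 GiB for the weights alone
overhead = 3.0                 # assumed GiB for KV cache and runtime buffers
print(f"~{weights:.1f} GiB weights, ~{weights + overhead:.1f} GiB total")
```

Even with a generous allowance for context, that leaves headroom under 24 GiB, which matches the "should work" expectation above.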


Reference: github-starred/ollama#70906