[GH-ISSUE #15915] Please add Granite 4.1, Llama 4, and Mistral Medium 3.5 to Ollama Cloud #72196

Open
opened 2026-05-05 03:37:17 -05:00 by GiteaMirror · 2 comments
Originally created by @rafi339 on GitHub (May 1, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15915

Hey Ollama team!

I'm Rafi, a 13-year-old ML developer from Germany. I build stuff with PyTorch, OpenGL/C++, and HTML/CSS/JavaScript — and I use Ollama Cloud Pro every day for my projects. I'm thinking about switching to yearly Pro if I get more model choices.

Could you please add these models as cloud variants? They're all already on Ollama locally, just need the cloud versions:

  1. IBM Granite 4.1 (3b, 8b, 30b) — just dropped a few days ago, looks solid for enterprise stuff
  2. Mistral Medium 3.5 (128B) — Mistral's newest flagship, 256K context, great for coding agents
  3. Llama 4 (Scout & Maverick) — super popular, already 1.6M+ downloads on Ollama local

I'd love to run these on Cloud since my machine can't handle the bigger ones. Would be awesome to have more options up there!

Thanks!
P.S. Also thanks for making an amazing open-source project. I love the cloud models because the usage limits are great, it's fast enough for me, and I like open-weight AI.


@AccidentalJedi commented on GitHub (May 1, 2026):

I too would like to access and use these via cloud... or at least the largest model of each one's respective family.


@eitelnick commented on GitHub (May 3, 2026):

+1 for Mistral Medium 3.5 on cloud.

Reference: github-starred/ollama#72196