[GH-ISSUE #10673] Olmo 2 1b #7018

Closed
opened 2026-04-12 18:54:50 -05:00 by GiteaMirror · 3 comments

Originally created by @KTRosenberg on GitHub (May 12, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10673

Are there plans to add Olmo's new v2 1B on-device-capable models? This would be very useful for research.

GiteaMirror added the model label 2026-04-12 18:54:50 -05:00

@rick-github commented on GitHub (May 12, 2025):

```
ollama run hf.co/allenai/OLMo-2-0425-1B-Instruct-GGUF:Q4_K_M
```

@KTRosenberg commented on GitHub (May 12, 2025):

Thanks. It's not on the webpage, so I wasn't aware it existed.


@rick-github commented on GitHub (May 12, 2025):

For many HF models with a GGUF quant, you can click "Use this model" on the right-hand side and it will show the ollama command required to download and run the model.
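For reference, the command shape Ollama accepts for Hugging Face GGUF repos follows a general pattern (a sketch; the quantization tag after the colon is optional, and Ollama picks a default quant from the repo if it is omitted):

```shell
# Pull and run a GGUF quant directly from Hugging Face.
# Pattern: ollama run hf.co/{username}/{repository}[:{quantization}]
ollama run hf.co/allenai/OLMo-2-0425-1B-Instruct-GGUF:Q4_K_M
```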


Reference: github-starred/ollama#7018