[GH-ISSUE #12923] Error running BitNet 1.58 model in Ollama #34330

Closed
opened 2026-04-22 17:47:21 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @faelp22 on GitHub (Nov 3, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12923

🙌 Thank you, Ollama Desktop Team!

First off, I want to express my appreciation for the incredible work you're doing with Ollama Desktop. It's exciting to see how accessible and powerful local model execution has become thanks to your platform.

I'm reporting an issue I encountered while attempting to run a GGUF model. I hope this helps improve compatibility and user experience for others working with similar setups.

The error occurs when trying to run the model hf.co/microsoft/bitnet-b1.58-2B-4T-gguf using Ollama Desktop.

<img width="1016" height="677" alt="Image" src="https://github.com/user-attachments/assets/380f1fac-73c9-4fbf-8ec9-2215bc2e9765" />

https://huggingface.co/microsoft/bitnet-b1.58-2B-4T-gguf

Reproduce
ollama run hf.co/microsoft/bitnet-b1.58-2B-4T-gguf

Thank you for your attention.

Relevant log output

```shell
Error: 500 Internal Server Error: unable to load model: C:\Users\xxxxx\.ollama\models\blobs\sha256-4221b252fdd5fd25e15847adfeb5ee88886506ba50b8a34548374492884c2162
```

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.12.9

GiteaMirror added the bug label 2026-04-22 17:47:21 -05:00
Author
Owner

@rick-github commented on GitHub (Nov 3, 2025):

BitNet is not a supported architecture.

#10337
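As a rough illustration of why the loader fails (this is not Ollama's actual code): a GGUF file declares its model architecture in the `general.architecture` key of its header metadata, and a runtime that has no implementation for that architecture, such as BitNet's, can only refuse to load the blob. The sketch below, assuming GGUF version 3 little-endian layout (magic `GGUF`, version, tensor count, key/value count, then string-typed key/value pairs), reads just enough of the header to recover that key.

```python
import io
import struct

GGUF_MAGIC = b"GGUF"
GGUF_TYPE_STRING = 8  # string value type per the GGUF spec


def _read_string(f):
    # GGUF v3 strings: uint64 length followed by UTF-8 bytes.
    (n,) = struct.unpack("<Q", f.read(8))
    return f.read(n).decode("utf-8")


def gguf_architecture(f):
    """Return the general.architecture metadata value from a GGUF stream.

    Simplified sketch: only string-valued metadata entries are handled,
    which is enough to demonstrate the architecture check.
    """
    if f.read(4) != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    version, n_tensors, n_kv = struct.unpack("<IQQ", f.read(20))
    for _ in range(n_kv):
        key = _read_string(f)
        (vtype,) = struct.unpack("<I", f.read(4))
        if vtype != GGUF_TYPE_STRING:
            raise ValueError("sketch only handles string metadata values")
        value = _read_string(f)
        if key == "general.architecture":
            return value
    return None
```

A runtime would compare the returned string against the architectures it implements; an unrecognized value (e.g. BitNet's) is rejected before any weights are loaded, which is the kind of failure surfaced here as "unable to load model".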

Author
Owner

@pdevine commented on GitHub (Nov 4, 2025):

Going to close this as a dupe.

Reference: github-starred/ollama#34330