[GH-ISSUE #6721] Error loading model architecture for miniCPM3-4B: Unknown architecture 'minicpm3' #66270

Closed
opened 2026-05-04 01:47:47 -05:00 by GiteaMirror · 3 comments

Originally created by @ChuiyuWang1 on GitHub (Sep 10, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6721

Hello Ollama team,

First of all, I want to express my appreciation for the amazing work you’ve done with Ollama. The tool has been incredibly helpful, and I hope your team continues to thrive and build even more powerful features!

I’ve encountered an issue while trying to use the miniCPM3-4B model with Ollama for inference.

Since there is no miniCPM3-4B model in the Ollama library, I used `yefx/minicpm3_4b` and `shibing624/minicpm3_4b`. When I run inference with these models, I get the following error:

llama_model_load: error loading model: error loading model architecture: unknown model architecture: 'minicpm3'

I'm using Ollama version 0.3.10 on Windows 11.

I have two questions here:

  1. Does Ollama currently support the miniCPM3-4B model architecture?
  2. If not, are there any plans to support it, or is there a workaround that would allow me to use this model with Ollama?

Thank you for your time and for any guidance you can provide!

GiteaMirror added the model label 2026-05-04 01:47:47 -05:00

@rick-github commented on GitHub (Sep 10, 2024):

  1. Not yet.
  2. https://github.com/ggerganov/llama.cpp/pull/9322
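
Once a build of Ollama ships with the llama.cpp change linked above, the usual workaround for models that are not in the library is to convert the Hugging Face checkpoint to GGUF yourself and import it with a Modelfile. A minimal sketch (the checkpoint directory and output file names here are assumptions, not taken from this thread):

```shell
# Convert the HF checkpoint to GGUF using llama.cpp's converter
# (requires a llama.cpp checkout that already includes minicpm3 support).
python convert_hf_to_gguf.py ./MiniCPM3-4B --outfile minicpm3-4b.gguf

# Point a Modelfile at the converted weights.
cat > Modelfile <<'EOF'
FROM ./minicpm3-4b.gguf
EOF

# Import the model into Ollama and run it.
ollama create minicpm3-4b -f Modelfile
ollama run minicpm3-4b
```

Note that the loader will keep printing `unknown model architecture: 'minicpm3'` until the llama.cpp bundled with Ollama actually contains that PR; the conversion step alone does not fix it.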

@jmorganca commented on GitHub (Sep 12, 2024):

Merging with https://github.com/ollama/ollama/issues/6722. Thanks @rick-github


@enryteam commented on GitHub (Oct 14, 2024):

On 0.3.13, the same error occurs.

Reference: github-starred/ollama#66270