[GH-ISSUE #12535] Request to adapt MiniCPM-V 4.5 and Qwen3-VL models to the new version of ollama #70376

Closed
opened 2026-05-04 21:18:45 -05:00 by GiteaMirror · 3 comments

Originally created by @pagesys on GitHub (Oct 8, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12535

Dear Development Team,

With the highest respect, we extend our sincerest greetings. You have brought a first-class deployment experience and infrastructure to the open-source community. For some time, however, adaptation work for new models has been somewhat delayed. We would therefore like to make a request to the development team: please consider adding support for the MiniCPM-V 4.5 model in the next release, while also accelerating work on the Qwen3-VL series models. The latest open-source SOTA (state-of-the-art) models have arrived.

As far as we know, the llama.cpp team officially added support for MiniCPM-V 4.5 in a previous release. Given the significant popularity of OpenBMB's edge-side multimodal large models, and the fact that OpenBMB has already submitted a PR to ollama for this adaptation, we, as IBM-approved AI experts, hope the team can prioritize timely support for the new MiniCPM-V 4.5 model.

With utmost regards,
Pengcheng Zhao
https://github.com/ggml-org/llama.cpp/pull/15575
https://github.com/ollama/ollama/pull/12078

GiteaMirror added the model label 2026-05-04 21:18:45 -05:00
@rick-github commented on GitHub (Oct 8, 2025):

https://github.com/ollama/ollama/issues/12137
https://github.com/ollama/ollama/issues/12397

@pagesys commented on GitHub (Oct 8, 2025):

> #12137 #12397
I have checked these issues. If I'm not mistaken, the support problem for Minicpm-v 4.5 will be resolved in the next version (0.12.4), right?
#12137


@rick-github commented on GitHub (Oct 8, 2025):

Correct.


Reference: github-starred/ollama#70376