[PR #10280] [CLOSED] support minicpm-omni #75490

Closed
opened 2026-05-05 07:55:02 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/10280
Author: @tc-mb
Created: 4/15/2025
Status: Closed

Base: main ← Head: support-minicpm-v-in-ollama-engine


📝 Commits (4)

  • 3c4816b init llm
  • de83613 Merge branch 'ollama:main' into support-minicpm-v-in-ollama-engine
  • de5994e Merge branch 'support-minicpm-v-in-ollama-engine' into main
  • 8182a73 Merge pull request #27 from tc-mb/main

📊 Changes

4 files changed (+251 additions, -0 deletions)

View changed files

📝 convert/convert.go (+2 -0)
➕ convert/convert_minicpm_o_2_6 (+96 -0)
➕ model/models/minicpm_o2.6/model_text.go (+152 -0)
📝 model/models/models.go (+1 -0)

📄 Description

I previously added minicpm-omni support to ollama by modifying the llama.cpp backend (https://github.com/ollama/ollama/pull/9672), but the ollama community is moving to a new engine, so I will gradually re-implement the minicpm-v series models on it. This requires a fair amount of work, and I will keep updating this PR.

  1. I have a preliminary understanding of the new engine's logic and have added the LLM part. Unfortunately, minicpm-omni is a multimodal model built on a qwen2.5 base, which in turn depends on merging qwen2 support first.
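The changed-files list above shows the shape of the work: a new model package (model/models/minicpm_o2.6/model_text.go) plus a one-line change to model/models/models.go. A common way new-engine model support is wired up is a registry pattern, where each model package registers a constructor for its architecture name and the central models.go merely makes that package reachable. The sketch below illustrates that pattern only; the type names, the Register function, and the "minicpm-o2.6" key are hypothetical stand-ins, not ollama's actual API.

```go
package main

import "fmt"

// Model is a stand-in for the engine's model interface (assumption,
// not ollama's real interface).
type Model interface {
	Forward(input string) string
}

// registry maps an architecture name to a constructor for that model.
var registry = map[string]func() Model{}

// Register is called from each model package's init(); in a real
// repo the central models.go would reach it via a blank import.
func Register(name string, ctor func() Model) {
	if _, exists := registry[name]; exists {
		panic("model already registered: " + name)
	}
	registry[name] = ctor
}

// minicpmText is a placeholder for the text tower this PR adds in
// model/models/minicpm_o2.6/model_text.go.
type minicpmText struct{}

func (m *minicpmText) Forward(input string) string {
	return "minicpm-o2.6 output for: " + input
}

func init() {
	// The +1 line in models.go corresponds to making this
	// registration reachable.
	Register("minicpm-o2.6", func() Model { return &minicpmText{} })
}

func main() {
	ctor, ok := registry["minicpm-o2.6"]
	if !ok {
		panic("architecture not supported")
	}
	fmt.Println(ctor().Forward("hello"))
}
```

Under this pattern, supporting a new architecture is two steps: add the model package with its init()-time registration, then add the single import line that links it in, which matches the +1/-0 diff to models.go above.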

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
GiteaMirror added the pull-request label 2026-05-05 07:55:02 -05:00

Reference: github-starred/ollama#75490