[GH-ISSUE #10499] How Do I Deploy bge-reranker-v2-m3 #6907

Closed
opened 2026-04-12 18:47:46 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @Nyoko74 on GitHub (Apr 30, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10499

What is the issue?

An error is reported when deploying the bge-reranker-v2-m3 model from Hugging Face (https://huggingface.co/BAAI/bge-reranker-v2-m3):

```
converting model Error: unsupported architecture
```

I know Ollama's model library provides this model, but for some reason I can't get it with `ollama pull`. What do I do to deploy it?
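For context, a conversion like this is usually attempted by pointing a Modelfile at a local clone of the Hugging Face repository; the exact steps were not given in the report, so the following is an assumption about what triggers the error:

```
# Modelfile (illustrative) — FROM points at a local clone of
# https://huggingface.co/BAAI/bge-reranker-v2-m3
FROM ./bge-reranker-v2-m3
```

Running `ollama create bge-reranker-v2-m3 -f Modelfile` then fails at the conversion step, since the model's sequence-classification (reranker) architecture is not one the converter recognizes.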

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-12 18:47:46 -05:00

@rick-github commented on GitHub (Apr 30, 2025):

Ollama doesn't currently support ranking models. https://github.com/ollama/ollama/issues/3368
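Since Ollama can't load reranker models, one common workaround (an assumption on my part, not suggested in the thread) is to score query/document pairs directly with Hugging Face `transformers`. The model name comes from the issue; the helper names below are illustrative:

```python
# Sketch of running bge-reranker-v2-m3 outside Ollama.
# Cross-encoder rerankers take (query, document) pairs and emit a
# relevance score per pair.

def make_pairs(query, documents):
    """Build the [query, document] pairs a cross-encoder reranker expects."""
    return [[query, doc] for doc in documents]

def rerank(query, documents, model_name="BAAI/bge-reranker-v2-m3"):
    """Return (document, score) tuples sorted by relevance, highest first."""
    # Imported here so the pure-Python helper above works without torch.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name)
    model.eval()

    inputs = tokenizer(make_pairs(query, documents), padding=True,
                       truncation=True, return_tensors="pt", max_length=512)
    with torch.no_grad():
        # One logit per pair; higher means more relevant.
        scores = model(**inputs).logits.view(-1).float()
    return sorted(zip(documents, scores.tolist()),
                  key=lambda pair: pair[1], reverse=True)

# Example call (downloads the model weights on first use):
# rerank("what is a panda?",
#        ["The giant panda is a bear native to China.",
#         "pandas is a Python data-analysis library."])
```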


Reference: github-starred/ollama#6907