[GH-ISSUE #5167] Unable to set "encoding_format" and "dimensions" parameters for the "mxbai-embed-large" #29016

Open
opened 2026-04-22 07:36:20 -05:00 by GiteaMirror · 1 comment

Originally created by @netandreus on GitHub (Jun 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5167

What is the issue?

It is great that Ollama has the [mxbai-embed-large](https://ollama.com/library/mxbai-embed-large:latest/blobs/b837481ff855) embedding model. I am trying to use this model with the "ubinary" encoding_format and 512 dimensions, following [this blog post](https://www.mixedbread.ai/blog/binary-mrl):

import { MixedbreadAIClient } from "@mixedbread-ai/sdk";

const mxbai = new MixedbreadAIClient({
  apiKey: "{MIXEDBREAD_API_KEY}"
});

const res = await mxbai.embeddings({
  model: 'mixedbread-ai/mxbai-embed-large-v1',
  input: [
    'Who is german and likes bread?',
    'Everybody in Germany.'
  ],
  normalized: true, // this has to be True if you want to use binary with faiss
  encoding_format: 'ubinary',
  dimensions: 512
})

but against a local Ollama server. I am confused that [these parameters are not present](https://ollama.com/library/mxbai-embed-large:latest/blobs/b837481ff855) in the model's parameters:

{
    "num_ctx": 512
}

Can you please add them? It would be very useful for Matryoshka Representation Learning.
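For reference, the effect of these two parameters can be approximated on the client side, following the Matryoshka truncation and binary quantization described in the linked blog post. A minimal sketch (my assumption of a workaround, not an Ollama or Mixedbread SDK feature; function names are illustrative) — the float embedding would come from Ollama's `/api/embeddings` endpoint, here replaced by a toy vector:

```javascript
// Matryoshka truncation ("dimensions"): keep the first `dims` components,
// then re-normalize so the shortened vector has unit length again.
function truncateNormalize(embedding, dims) {
  const head = embedding.slice(0, dims);
  const norm = Math.sqrt(head.reduce((s, x) => s + x * x, 0));
  return head.map((x) => x / norm);
}

// Binary quantization ("ubinary"): one bit per component, MSB first,
// set when the component is positive; 8 components pack into one byte.
function toUbinary(embedding) {
  const bytes = new Uint8Array(Math.ceil(embedding.length / 8));
  embedding.forEach((x, i) => {
    if (x > 0) bytes[i >> 3] |= 0x80 >> (i & 7);
  });
  return bytes;
}

// Toy demo: a real 1024-dim mxbai-embed-large vector truncated to 512 dims
// would pack into 512 / 8 = 64 bytes.
const short2 = truncateNormalize([3, 4, 1, -2], 2); // [0.6, 0.8]
const bits = toUbinary([1, -1, 2, -3, 5, -1, -1, -1]); // Uint8Array [0xa8]
```

Against a real server one would first POST `{"model": "mxbai-embed-large", "prompt": "..."}` to `http://localhost:11434/api/embeddings` and apply the two functions to the returned `embedding` array.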

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.1.43

GiteaMirror added the bug label 2026-04-22 07:36:20 -05:00

@ralucamb88 commented on GitHub (Mar 20, 2025):

I have the same issue. Is a fix in progress? Changing the encoding_format from the default "float" to "binary" or "ubinary" does not take effect; it only works when testing with Python's SentenceTransformers. Thanks


Reference: github-starred/ollama#29016