[GH-ISSUE #6175] Fail when calling ollama.embeddings function #3858

Closed
opened 2026-04-12 14:41:48 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @weixu-tf4 on GitHub (Aug 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6175

What is the issue?

I was following the example from https://ollama.com/blog/embedding-models
But it always fails when running: response = ollama.embeddings(model='mxbai-embed-large', prompt=d)
I don't know why.
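(The failing call uses prompt=d, which in the blog post comes from a loop over a documents list. The sketch below is a guess at that setup, not code from the report; valid_prompt is a hypothetical sanity check, since an undefined, empty, or non-string d is a common cause of this kind of failure.)

```python
# Hypothetical sanity check -- not from the original report. An empty or
# non-string prompt can make the embeddings call fail in confusing ways.
def valid_prompt(d):
    return isinstance(d, str) and len(d.strip()) > 0

if __name__ == "__main__":
    import ollama  # assumes the ollama Python client and a running server

    # Documents list assumed from the embedding-models blog post.
    documents = ["Llamas are members of the camelid family"]
    for d in documents:
        if not valid_prompt(d):
            raise ValueError(f"bad prompt: {d!r}")
        response = ollama.embeddings(model='mxbai-embed-large', prompt=d)
        print(len(response['embedding']))
```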

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.3.3

GiteaMirror added the needs more info and bug labels 2026-04-12 14:41:48 -05:00
Author
Owner

@rick-github commented on GitHub (Aug 5, 2024):

If you can provide the code you are running, the error message you receive, and the server logs (https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) when the error occurred, it will make debugging easier.
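(As a sketch of how to capture the error message being asked for: wrap the failing call and print the full traceback so it can be pasted into the issue verbatim. capture_traceback is a hypothetical helper, and the ollama call is illustrative.)

```python
import traceback

# Hypothetical helper: format an exception's full traceback as a string,
# ready to paste into a bug report.
def capture_traceback(exc):
    return "".join(
        traceback.format_exception(type(exc), exc, exc.__traceback__)
    )

if __name__ == "__main__":
    import ollama  # assumes the ollama Python client and a running server
    try:
        response = ollama.embeddings(model='mxbai-embed-large', prompt='hello')
        print(response)
    except Exception as exc:
        print(capture_traceback(exc))
```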

Author
Owner

@royjhan commented on GitHub (Aug 5, 2024):

I was unable to reproduce the issue running the example below:

import ollama
response = ollama.embeddings(model='mxbai-embed-large', prompt="hello")
print(response)

Can you provide more information on the error? Namely the exact code you are running and the error message.

Author
Owner

@jmorganca commented on GitHub (Sep 2, 2024):

Hi @weixu-tf4 thanks for the issue. It will be hard to debug this issue to the finish line without more info. Feel free to add to this issue and I will re-open it so we can make sure you can generate embeddings :)


Reference: github-starred/ollama#3858