[GH-ISSUE #6555] /api/embed returns empty embeddings in docker environment #50635

Closed
opened 2026-04-28 16:41:30 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @smoothdvd on GitHub (Aug 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6555

What is the issue?

```shell
curl http://localhost:11434/api/embed -d '{
  "model": "bge-m3",
  "prompt": "Llamas are members of the camelid family"
}'
```

Response:

```json
{"model":"bge-m3","embeddings":[]}
```

but the old `/api/embeddings` endpoint still works

OS

Docker

GPU

Nvidia

CPU

Intel

Ollama version

0.3.8

GiteaMirror added the bug label 2026-04-28 16:41:30 -05:00

@rick-github commented on GitHub (Aug 29, 2024):

[/api/embed](https://github.com/ollama/ollama/blob/main/docs/api.md#generate-embeddings) uses `input`, not `prompt`.
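A corrected version of the request above, renaming the field as suggested (this is a sketch: it assumes an Ollama server listening on localhost:11434 with the `bge-m3` model already pulled):

```shell
# /api/embed expects "input" (a string or array of strings), not "prompt"
curl http://localhost:11434/api/embed -d '{
  "model": "bge-m3",
  "input": "Llamas are members of the camelid family"
}'
```

With `input` set, the response's `embeddings` array should contain one vector per input string instead of coming back empty.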


Reference: github-starred/ollama#50635