[GH-ISSUE #8522] Ollama throws 'does not support generate' error on running embedding models on windows #5494

Closed
opened 2026-04-12 16:43:48 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @tanmaysharma2001 on GitHub (Jan 21, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8522

What is the issue?

Hi,
as the title says, when using the Ollama CLI and trying to run any embedding model listed on the website (in this case nomic-embed-text), it throws the following error:

Error: "nomic-embed-text" does not support generate

To reproduce:

  1. Install Ollama on Windows from the website.
  2. Run:
ollama run nomic-embed-text

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.5.7

GiteaMirror added the bug label 2026-04-12 16:43:48 -05:00
Author
Owner

@rick-github commented on GitHub (Jan 21, 2025):

The ollama CLI doesn't do embeddings. Use the API:

curl localhost:11434/api/embed -d "{\"model\":\"nomic-embed-text\",\"input\":\"why is the sky blue?\"}"
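For programmatic use, the same /api/embed endpoint can be called from any HTTP client. A minimal Python sketch, assuming a local Ollama server on the default port 11434 with nomic-embed-text already pulled (the helper names here are illustrative, not part of Ollama):

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # default Ollama listen address

def build_embed_request(text, model="nomic-embed-text"):
    """Build the JSON payload expected by Ollama's /api/embed endpoint."""
    return {"model": model, "input": text}

def embed(text, model="nomic-embed-text"):
    """POST to /api/embed and return the list of embedding vectors."""
    payload = json.dumps(build_embed_request(text, model)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_HOST}/api/embed",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["embeddings"]

if __name__ == "__main__":
    vectors = embed("why is the sky blue?")
    print(f"got {len(vectors)} vector(s) of dimension {len(vectors[0])}")
```

This also sidesteps the Windows cmd.exe quoting shown in the curl example, since the JSON is built natively rather than escaped by hand.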

Reference: github-starred/ollama#5494