[GH-ISSUE #7909] ollama run quentinz/bge-large-zh-v1.5:latest Error: "quentinz/bge-large-zh-v1.5:latest" does not support generate #51572

Closed
opened 2026-04-28 20:35:44 -05:00 by GiteaMirror · 3 comments

Originally created by @cqray1990 on GitHub (Dec 2, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7909

What is the issue?

```console
ollama pull quentinz/bge-large-zh-v1.5
```

When starting `quentinz/bge-large-zh-v1.5:latest`, it raises an error:

```console
ollama run quentinz/bge-large-zh-v1.5:latest
Error: "quentinz/bge-large-zh-v1.5:latest" does not support generate
```

OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-28 20:35:44 -05:00

@rick-github commented on GitHub (Dec 2, 2024):

`quentinz/bge-large-zh-v1.5` is an embedding model; it can only produce embeddings.

```console
$ curl -s localhost:11434/api/embed -d '{"model":"quentinz/bge-large-zh-v1.5","input":"Why is the sky blue?"}' | jq '.embeddings=[.embeddings[]|length]'
{
  "model": "quentinz/bge-large-zh-v1.5",
  "embeddings": [
    1024
  ],
  "total_duration": 525616181,
  "load_duration": 396202187,
  "prompt_eval_count": 6
}
```

```console
$ curl -s localhost:11434/api/generate -d '{"model":"quentinz/bge-large-zh-v1.5","prompt":"Why is the sky blue?"}' | jq
{
  "error": "\"quentinz/bge-large-zh-v1.5\" does not support generate"
}
```

@cqray1990 commented on GitHub (Dec 2, 2024):

I want to use embedding models, but they can't be run with the `ollama run` command.



@rick-github commented on GitHub (Dec 2, 2024):

There is no `embed` command in the ollama CLI. You can simulate one with a script that calls the API.

```python
#!/usr/bin/env python3

import ollama
import sys

print(ollama.Client().embed(model=sys.argv[1], input=" ".join(sys.argv[2:]))["embeddings"])
```

```console
$ ./ollama-embed.py quentinz/bge-large-zh-v1.5 Why is the sky blue?
[[0.028106524,0.02807588,...,-0.009205404]]
```

This is just an example; if you wanted to do more advanced things, like embedding multiple inputs at once, you'd need to modify it accordingly.
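Embedding vectors like the ones returned above are usually consumed by downstream code rather than read directly, most commonly by comparing two of them with cosine similarity (e.g. for semantic search). A minimal sketch of that comparison, using only the standard library (the function name is illustrative, not part of the ollama client):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors.

    Returns 1.0 for vectors pointing the same way, 0.0 for orthogonal
    vectors, and -1.0 for opposite vectors.
    """
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```

You would pass two of the 1024-element vectors from `/api/embed` to this function and rank documents by the resulting score.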
