[GH-ISSUE #3117] Api /tags should include type for embedding model or llm #27677

Closed
opened 2026-04-22 05:12:43 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @Hansson0728 on GitHub (Mar 13, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3117

As the title says, it would be nice to have that information so we can filter out embedding models when allowing model switching on a frontend.

GiteaMirror added the feature request, api labels 2026-04-22 05:12:44 -05:00
Author
Owner

@amila-ku commented on GitHub (Mar 15, 2024):

@Hansson0728 would you mind sharing an example that shows how you expect to see it in the response payload? An example response is [here](https://github.com/ollama/ollama/blob/main/docs/api.md#examples-5).

Author
Owner

@Hansson0728 commented on GitHub (Mar 18, 2024):

I would say something like:

```json
    {
      "name": "codellama:13b",
      "modified_at": "2023-11-04T14:56:49.277302595-07:00",
      "size": 7365960935,
      "digest": "9f438cb9cd581fc025612d27f7c1a6669ff83a8bb0ed86c94fcf4c5440555697",
      "details": {
        "format": "gguf",
        "family": "llama",
        "families": null,
        "type": "llm",
        "parameter_size": "13B",
        "quantization_level": "Q4_0"
      }
    },
```

where `"type"` would be either `"llm"` or `"embedding"`.
Author
Owner

@it-s commented on GitHub (Jun 5, 2024):

> i would say somehing like:
>
> ```
>     {
>       "name": "codellama:13b",
>       "modified_at": "2023-11-04T14:56:49.277302595-07:00",
>       "size": 7365960935,
>       "digest": "9f438cb9cd581fc025612d27f7c1a6669ff83a8bb0ed86c94fcf4c5440555697",
>       "details": {
>         "format": "gguf",
>         "family": "llama",
>         "families": null,
>         "Type": 'LLM' ('Embedding')
>         "parameter_size": "13B",
>         "quantization_level": "Q4_0"
>       }
>     },
> ```

I would suggest also including a "capabilities" property: a list of tags that describe what a model is capable of:

```
"capabilities": ["code-completion", "image-processing", "embedding", "story-telling", ...]
```

Since ollama now includes image-capable models such as llava, the UI should know whether users are allowed to send images to a model or not.
https://github.com/ollama/ollama/issues/4835
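As a minimal sketch of how a frontend could use such a list: assuming the hypothetical `capabilities` field and capability names proposed above (these are illustrative, not an actual Ollama API), gating the image-upload button might look like this:

```python
def allows_images(model: dict) -> bool:
    """Return True if the model advertises image input support.

    Assumes the hypothetical "capabilities" list proposed in this
    thread; both the field and the tag names are illustrative.
    """
    return "image-processing" in model.get("capabilities", [])


# Illustrative payloads, not real API responses.
llava = {"name": "llava", "capabilities": ["code-completion", "image-processing"]}
codellama = {"name": "codellama:13b", "capabilities": ["code-completion"]}

print(allows_images(llava))      # True
print(allows_images(codellama))  # False
```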

Author
Owner

@it-s commented on GitHub (Jun 5, 2024):

Also, I see that this is already done in the latest released version of ollama. So, this request can be closed :D

Author
Owner

@FreshLucas-git commented on GitHub (Jun 17, 2024):

> Also I see that this is already done i the latest released version of ollama. So, this request can be closed :D

@it-s Hi. Could you tell me which version of Ollama has already implemented this feature? I didn't see this feature implemented in the latest version of Ollama.

Author
Owner

@rick-github commented on GitHub (Feb 2, 2026):

```console
$ curl -s localhost:11434/api/show -d '{"model":"codellama:13b"}' | jq .capabilities
[
  "completion"
]
$ curl -s localhost:11434/api/show -d '{"model":"nomic-embed-text"}' | jq .capabilities
[
  "embedding"
]
$ curl -s localhost:11434/api/show -d '{"model":"qwen3-vl"}' | jq .capabilities
[
  "completion",
  "vision",
  "tools",
  "thinking"
]
```
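To get back to the original request: given capability lists like the ones shown above (one `/api/show` call per tag), filtering embedding-only models out of a frontend's model switcher reduces to a small pure function. This sketch assumes the capability data has already been fetched; the fetching itself and the model names are illustrative:

```python
def chat_models(models: dict[str, list[str]]) -> list[str]:
    """Keep only models usable for chat, i.e. whose capabilities
    include "completion", dropping embedding-only models.

    `models` maps model name -> capabilities list, as reported by
    the "capabilities" field of /api/show for each tag.
    """
    return [name for name, caps in models.items() if "completion" in caps]


# Example data mirroring the /api/show output above.
fetched = {
    "codellama:13b": ["completion"],
    "nomic-embed-text": ["embedding"],
    "qwen3-vl": ["completion", "vision", "tools", "thinking"],
}

print(chat_models(fetched))  # ['codellama:13b', 'qwen3-vl']
```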
Reference: github-starred/ollama#27677