[GH-ISSUE #12202] More information about the model in the local api #70176

Closed
opened 2026-05-04 20:36:35 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @EliasPereirah on GitHub (Sep 6, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12202

http://localhost:11434/api/tags returns something like:

```json
{
  "models": [
    {
      "name": "embeddinggemma:latest",
      "model": "embeddinggemma:latest",
      "modified_at": "2025-09-04T15:55:30.937224963-03:00",
      "size": 621867480,
      "digest": "693ca723e5e742b76da1a20ab96b70c49dacbbd20c22969d2d98c3be46bfb17d",
      "details": {
        "parent_model": "",
        "format": "gguf",
        "family": "gemma3",
        "families": [
          "gemma3"
        ],
        "parameter_size": "307.58M",
        "quantization_level": "BF16"
      }
    },
    {
      "name": "gemma3n:e2b",
      "model": "gemma3n:e2b",
      "modified_at": "2025-06-26T18:07:14.431922815-03:00",
      "size": 5621616562,
      "digest": "719372f8c7deee188821a4dcbaf75efa13a342d7e88a79d4fc2412b24947f6fd",
      "details": {
        "parent_model": "",
        "format": "gguf",
        "family": "gemma3n",
        "families": [
          "gemma3n"
        ],
        "parameter_size": "4.5B",
        "quantization_level": "Q4_K_M"
      }
    },
```

The problem is that there is no distinction between generative AI models and embedding models.

More details about the model would also be useful, such as whether it supports "tools", "vision", and "thinking".
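To illustrate the gap, here is a minimal Python sketch (sample trimmed from the `/api/tags` output above) that inspects the fields the endpoint returns. None of them identifies the model type or its optional capabilities:

```python
import json

# Sample trimmed from the /api/tags response above; the point is that no
# field marks a model as embedding vs. generative.
tags = json.loads("""
{"models": [
  {"name": "embeddinggemma:latest",
   "details": {"family": "gemma3", "parameter_size": "307.58M"}},
  {"name": "gemma3n:e2b",
   "details": {"family": "gemma3n", "parameter_size": "4.5B"}}
]}
""")

fields = set()
for m in tags["models"]:
    fields.update(m)
    fields.update(m["details"])

# No "capabilities", "tools", "vision", or "thinking" key is present,
# so a client cannot tell the two model types apart from this response.
assert "capabilities" not in fields
print(sorted(fields))
```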

GiteaMirror added the feature request label 2026-05-04 20:36:35 -05:00
Author
Owner

@rick-github commented on GitHub (Sep 6, 2025):

```console
$ curl -s localhost:11434/api/show -d '{"model":"embeddinggemma:latest"}' | jq .capabilities
[
  "embedding"
]
$ curl -s localhost:11434/api/show -d '{"model":"gemma3n:e2b"}' | jq .capabilities
[
  "completion"
]
$ curl -s localhost:11434/api/show -d '{"model":"llama4"}' | jq .capabilities
[
  "completion",
  "vision",
  "tools"
]
$ curl -s localhost:11434/api/show -d '{"model":"qwen3"}' | jq .capabilities
[
  "completion",
  "tools",
  "thinking"
]
```
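Building on the answer above, a client can already classify models from the `capabilities` list that `POST /api/show` returns. A minimal Python sketch (the helper names are hypothetical, not part of the Ollama API):

```python
# Client-side workaround: classify a model from the "capabilities" list
# returned by POST /api/show, as shown in the curl examples above.

def classify(capabilities):
    """Return 'embedding' or 'generative' based on the capabilities list."""
    return "embedding" if "embedding" in capabilities else "generative"

def feature_flags(capabilities):
    """Map the optional capabilities to booleans."""
    return {f: f in capabilities for f in ("tools", "vision", "thinking")}

print(classify(["embedding"]))                # embedding
print(classify(["completion", "tools"]))      # generative
print(feature_flags(["completion", "tools", "thinking"]))
# {'tools': True, 'vision': False, 'thinking': True}
```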

Reference: github-starred/ollama#70176