[GH-ISSUE #7751] List of all available models #30708

Closed
opened 2026-04-22 10:36:40 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @vt-alt on GitHub (Nov 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7751

Please provide a list of (or an API to list) all models available on https://ollama.com/library, with their tags.
This would let users get them from the CLI without a browser.
It would also be useful for shell completions.

GiteaMirror added the feature request label 2026-04-22 10:36:40 -05:00

@vt-alt commented on GitHub (Nov 20, 2024):

I see this request keeps reappearing. The model list could just be a file packed into the docs with every release, so no internet access would be needed.


@rick-github commented on GitHub (Nov 20, 2024):

Until the API is implemented, https://github.com/ollama/ollama/issues/7239#issuecomment-2423530279 has some code that scrapes the library.


@kth8 commented on GitHub (Nov 20, 2024):

I've been using these two one-liners in a [justfile](https://github.com/casey/just) to quickly get models and tags:

```
set positional-arguments := true

default:
    just --list

# list Ollama models
@models:
    curl -s https://ollama.com/library | grep -oP 'href="/library/\K[^"]+'

# list model tags
@tags model:
    curl -s https://ollama.com/library/$1/tags | grep -o "$1:[^\" ]*q[^\" ]*" | grep -E -v 'text|base|fp|q[45]_[01]'
```
```
$ just models
qwen2.5-coder
llama3.2-vision
llama3.2
llama3.1
llama3
mistral
...

$ just tags llama3.2
llama3.2:1b-instruct-q2_K
llama3.2:1b-instruct-q3_K_L
llama3.2:1b-instruct-q3_K_M
llama3.2:1b-instruct-q3_K_S
llama3.2:1b-instruct-q4_K_M
llama3.2:1b-instruct-q4_K_S
llama3.2:1b-instruct-q5_K_M
llama3.2:1b-instruct-q5_K_S
llama3.2:1b-instruct-q6_K
llama3.2:1b-instruct-q8_0
llama3.2:3b-instruct-q2_K
llama3.2:3b-instruct-q3_K_L
llama3.2:3b-instruct-q3_K_M
llama3.2:3b-instruct-q3_K_S
llama3.2:3b-instruct-q4_K_M
llama3.2:3b-instruct-q4_K_S
llama3.2:3b-instruct-q5_K_M
llama3.2:3b-instruct-q5_K_S
llama3.2:3b-instruct-q6_K
llama3.2:3b-instruct-q8_0
```
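For reference, the extraction pattern in the `models` recipe can be checked against a canned HTML snippet without any network access. The snippet below is made up, but it uses the same `href="/library/NAME"` link shape the pattern targets on the real library page:

```shell
# Demonstrate the grep pattern from the `models` recipe on a fake,
# self-contained HTML snippet (no network access needed).
sample_html='<a href="/library/llama3.2">llama3.2</a><a href="/library/mistral">mistral</a>'

# \K discards everything matched so far, so only the model name is printed.
printf '%s\n' "$sample_html" | grep -oP 'href="/library/\K[^"]+'
```

Note that `-P` (PCRE, needed for `\K`) requires GNU grep; on BSD/macOS grep a `sed` rewrite would be needed instead.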

@vt-alt commented on GitHub (Nov 20, 2024):

> I've been using these two one-liners in a justfile to quickly get models and tags

Thanks. Wow, but it's so slow, and run-time HTTP requests are definitely not suitable for bash completions. But of course, the list could be prepared in advance.
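The "prepared in advance" idea could look something like the following hypothetical bash sketch: a cached model list refreshed at most once a day, fed into `compgen` for completion. The cache path, refresh interval, and function names here are all assumptions, not anything Ollama ships.

```shell
# Hypothetical bash completion backed by a pre-scraped model list.
_ollama_models_cache="${XDG_CACHE_HOME:-$HOME/.cache}/ollama-library-models"

_refresh_ollama_models() {
    # Re-scrape only if the cache is missing/empty or older than one day,
    # so completion itself almost never touches the network.
    if [ ! -s "$_ollama_models_cache" ] || \
       [ -n "$(find "$_ollama_models_cache" -mtime +0 2>/dev/null)" ]; then
        mkdir -p "$(dirname "$_ollama_models_cache")"
        curl -s https://ollama.com/library \
            | grep -oP 'href="/library/\K[^"]+' > "$_ollama_models_cache"
    fi
}

_ollama_complete() {
    _refresh_ollama_models
    # Offer every cached model name matching the current word.
    COMPREPLY=( $(compgen -W "$(cat "$_ollama_models_cache")" \
                          -- "${COMP_WORDS[COMP_CWORD]}") )
}

# Register in an interactive shell, e.g. in ~/.bashrc:
#   complete -F _ollama_complete ollama
```

A cron job or systemd timer could also refresh the cache in the background, keeping completion instant.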


@kth8 commented on GitHub (Nov 21, 2024):

Scraping the whole library takes 0.8s and the tags take 0.3s in my benchmark:

```
$ hyperfine "just models"
Benchmark 1: just models
  Time (mean ± σ):     813.1 ms ±  79.3 ms    [User: 51.7 ms, System: 14.7 ms]
  Range (min … max):   634.0 ms … 887.5 ms    10 runs

$ hyperfine "just tags llama3.2"
Benchmark 1: just tags llama3.2
  Time (mean ± σ):     285.3 ms ±  87.1 ms    [User: 50.9 ms, System: 14.9 ms]
  Range (min … max):   199.6 ms … 530.8 ms    11 runs
```

I'm not trying to use this for autocomplete; I mainly use it for automated benchmarking:

```
Ollama-MMLU-Pro $ just tags llama3.2 > models.txt
Ollama-MMLU-Pro $ while read line; do ollama pull $line; pipenv run python run_openai.py -u http://127.0.0.1:11434/v1 -p 3 --category other -m $line; done < models.txt
```
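The same loop can be wrapped in a small function so the model source is pluggable (a file, or `just tags ...` piped directly) and empty lines are skipped; `bench_models` is a hypothetical name, and `run_openai.py` with its flags is taken verbatim from the comment above:

```shell
# Hypothetical wrapper around the pull-and-benchmark loop above.
# Reads one model name per line from stdin.
bench_models() {
    while IFS= read -r model; do
        # Skip blank lines so a trailing newline doesn't trigger a bad pull.
        [ -n "$model" ] || continue
        ollama pull "$model"
        pipenv run python run_openai.py -u http://127.0.0.1:11434/v1 \
            -p 3 --category other -m "$model"
    done
}

# Usage, without the intermediate models.txt:
#   just tags llama3.2 | bench_models
```

Quoting `"$model"` also keeps the loop safe should a tag ever contain shell metacharacters.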

@vt-alt commented on GitHub (Nov 21, 2024):

Yes, it's much faster now; perhaps there was some glitch at the time I tested it. Thanks.


@dhiltgen commented on GitHub (Nov 21, 2024):

Looks like a dup of #286

Reference: github-starred/ollama#30708