[GH-ISSUE #8541] I should not have to write the full model name #5509

Closed
opened 2026-04-12 16:45:22 -05:00 by GiteaMirror · 18 comments
Owner

Originally created by @RustoMCSpit on GitHub (Jan 22, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8541

If I want to run mistral, and mistral is the only model I have starting with an "m", I should just have to type `ollama run m`.
If there's something called "mestral", then I can type `ollama run mi`, and so on.

GiteaMirror added the feature request label 2026-04-12 16:45:22 -05:00

@rick-github commented on GitHub (Jan 23, 2025):

```
ollama cp mistral m
```
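A copy like this is cheap: Ollama stores model blobs content-addressed, so `cp` adds a new name without duplicating the weights. A small wrapper can guard against clobbering an existing name (a sketch; `short_alias` is a hypothetical helper, not an Ollama command, and it assumes `ollama list` prints a header row followed by names in the first column):

```shell
# Hypothetical helper: alias MODEL to SHORT via `ollama cp`,
# refusing to reuse a name that already exists locally.
short_alias() {
  local model=$1 short=$2
  if ollama list | awk 'NR>1 {print $1}' | grep -qx "$short"; then
    echo "'$short' already names a model, not overwriting" >&2
    return 1
  fi
  ollama cp "$model" "$short"
}
```

e.g. `short_alias mistral:latest m`, after which `ollama run m` works.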

@RustoMCSpit commented on GitHub (Jan 23, 2025):

> ```
> ollama cp mistral m
> ```

On principle, people shouldn't have to do this. It's a QoL thing.


@rick-github commented on GitHub (Jan 23, 2025):

If you use a non-Windows system you can try tab completion: https://github.com/ollama/ollama/issues/7239
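For bash users, a minimal completion hook along these lines can be dropped into `.bashrc` (a sketch, not the official completion from #7239; it assumes `ollama` is on PATH and that `ollama list` prints a header row followed by names in the first column):

```shell
# Sketch: complete `ollama run <TAB>` with locally installed model names.
_ollama_run_complete() {
  local cur=${COMP_WORDS[COMP_CWORD]}
  local models
  # Skip the header line of `ollama list`; take the NAME column.
  models=$(ollama list 2>/dev/null | awk 'NR>1 {print $1}')
  COMPREPLY=($(compgen -W "$models" -- "$cur"))
}
complete -F _ollama_run_complete ollama
```

Note this naive version completes model names for every `ollama` subcommand, not just `run`; a fuller script would inspect `${COMP_WORDS[1]}` first.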


@RustoMCSpit commented on GitHub (Jan 23, 2025):

> If you use a non-Windows system you can try tab completion, #7239

Doesn't work for me (Linux Mint, btw).


@hey-august commented on GitHub (Feb 1, 2025):

Tab completion does not work on macOS either.

Lack of autocomplete is an annoying oversight.


@rick-github commented on GitHub (Feb 1, 2025):

macOS uses zsh, right?


@hey-august commented on GitHub (Feb 1, 2025):

@rick-github Yes, it has been the default shell since 2019.


@rick-github commented on GitHub (Feb 1, 2025):

OK, I'll look at making a more portable version.


@hey-august commented on GitHub (Feb 1, 2025):

Thank you!

I see someone has configured the fish shell to do tab completion in #4444; this could be done in the zsh config as well. But I do think it would be a good QoL improvement to implement this behavior in Ollama itself.

Btw, this FR appears to be a duplicate of #925.


@RustoMCSpit commented on GitHub (Mar 19, 2025):

@hey-august I don't think it's a duplicate; I don't even want to have to press tab.


@RustoMCSpit commented on GitHub (Mar 19, 2025):

> OK, I'll look at making a more portable version.

Does zsh work on Linux?


@rick-github commented on GitHub (Mar 19, 2025):

> I don't think it's a duplicate; I don't even want to have to press tab.

```sh
#!/bin/bash

die(){
  echo "$*" >&2
  exit 1
}

_=$(command -v jq) || die "Need jq"
_=$(command -v curl) || die "Need curl"

OLLAMA_HOST=${OLLAMA_HOST:-localhost:11434}

model=$1 ; shift

list=($(curl -s $OLLAMA_HOST/api/tags | jq -r '.models[].name' | grep "^$model"))

[ ${#list[*]} -eq 0 ] && die "No model '$model'"

MODEL=0
[ ${#list[*]} -gt 1 ] && {
  i=0; for m in ${list[*]} ; do printf "[%2d] $m\n" $[i++] ; done

  while
    read -p 'Select model: ' MODEL || { echo ; exit 0 ; }
    ! _=$(expr match "$MODEL" "[0-9][0-9]*$") || [ "$MODEL" -ge ${#list[*]} ]
    do
      :
  done
}

ollama run ${list[$MODEL]} $*
```

> Does zsh work on Linux?

Yes.


@hey-august commented on GitHub (Mar 20, 2025):

@rick-github this is cool. I was able to set it up in zsh. It didn't work for me at first; I ran it through an AI and it recommended replacing `expr` with a regex.

This version is working great for me:

```bash
#!/bin/bash

die(){
  echo "$*" >&2
  exit 1
}

_=$(command -v jq) || die "Need jq"
_=$(command -v curl) || die "Need curl"

OLLAMA_HOST=${OLLAMA_HOST:-localhost:11434}

model=$1 ; shift

list=($(curl -s $OLLAMA_HOST/api/tags | jq -r '.models[].name' | grep "^$model"))

[ ${#list[*]} -eq 0 ] && die "No model '$model'"

MODEL=0
[ ${#list[*]} -gt 1 ] && {
  i=0; for m in "${list[@]}" ; do printf "[%2d] %s\n" $i "$m" ; i=$((i+1)) ; done

  while
    read -p 'Select model: ' MODEL || { echo ; exit 0 ; }
    ! [[ "$MODEL" =~ ^[0-9]+$ ]] || [ "$MODEL" -ge ${#list[*]} ]
  do
    echo "Invalid selection. Please enter a number between 0 and $((${#list[*]} - 1))."
  done
}

ollama run "${list[$MODEL]}" "$@"
```

I then aliased `ol` to this script in my `.zshrc`.

## Behavior

Running `ol` with a string that only matches the start of one model name directly runs that model. In this case, "de" only matches `deepseek-r1:14b`.

```bash
~ % ol de
>>> Send a message (/? for help)
```

Run by itself or with a string matching multiple models, it instead generates a numbered list. When one is selected by inputting the relevant number, that model is run.

```zsh
~ % ol
[ 0] ds:latest
[ 1] qwenc:latest
[ 2] qwen2.5-coder:14b
[ 3] qwen2:latest
[ 4] deepseek-r1:14b
Select model: 4
>>> Send a message (/? for help)
```

@hey-august commented on GitHub (Mar 25, 2025):

Check out the ergonomics of @coder543's fish shell completion. Super nice tab hinting and completion.

![Image](https://github.com/user-attachments/assets/ef4c764e-e867-45ba-afd5-95a8a98a2ded)


@jmorganca commented on GitHub (Apr 14, 2025):

Hey all, thanks for the issue, and I love the creativity in solving this with shell scripts. Ollama's API currently expects complete model names (vs prefixes) for simplicity, and while we'd love to make shorthands like this work, there's a lot of other functionality in the API we'd want to support first and I don't think we'd get to this any time soon.


@RustoMCSpit commented on GitHub (Apr 14, 2025):

> Hey all, thanks for the issue, and I love the creativity in solving this with shell scripts. Ollama's API currently expects complete model names (vs prefixes) for simplicity, and while we'd love to make shorthands like this work, there's a lot of other functionality in the API we'd want to support first and I don't think we'd get to this any time soon.

Why close this then? Just tag it for 'do later'. Also, thanks for all you do.


@RustoMCSpit commented on GitHub (May 10, 2025):

> Check out the ergonomics of @coder543's fish shell completion. Super nice tab hinting and completion.
>
> ![Image](https://github.com/user-attachments/assets/ef4c764e-e867-45ba-afd5-95a8a98a2ded)

Where is the code for this? Looks great.


@coder543 commented on GitHub (May 10, 2025):

@RustoMCSpit it’s here: https://github.com/ollama/ollama/issues/4444

Reference: github-starred/ollama#5509