[GH-ISSUE #5914] Alias names for models #50202

Closed
opened 2026-04-28 14:43:36 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @jpummill on GitHub (Jul 24, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5914

Model names are hard to remember. They can be very long and somewhat cryptic.

For example, I may have the following models on my system for testing:
mistral-nemo:12b-instruct-2407-q3_K_S
mistral-nemo:12b-instruct-2407-q4_K_S
mistral-nemo:12b-instruct-2407-q5_K_M

I think it would be helpful to be able to create alias names for models. For example, I might use the following aliases: mn12-3, mn12-4, and mn12-5.

We would still have the true name as a reference, but we could also have a column in the list output for the alias name, and Ollama would accept either name for loading.

GiteaMirror added the feature request label 2026-04-28 14:43:36 -05:00

@hlstudio commented on GitHub (Jul 24, 2024):

You can use:

```
ollama cp mistral-nemo:12b-instruct-2407-q3_K_S mn12-3
```

then:

```
ollama run mn12-3
```


@rick-github commented on GitHub (Jul 24, 2024):

```
ollama cp mistral-nemo:12b-instruct-2407-q3_K_S mn12-3
```

The copy will have pointers to the data of the original, so it doesn't take any extra space (other than the manifest file). You can find the original by looking for the `Id` in the output of `ollama list`.
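As a sketch of how one might spot such copies, the shared `Id` column can be grouped with a small `awk` pipeline. The `ollama list` output below is illustrative sample data, not captured from a real system:

```shell
# Group model names by their shared ID to find aliases created with `ollama cp`.
# Sample `ollama list`-style output; columns are NAME, ID, SIZE (illustrative).
list_output='NAME                                    ID            SIZE
mistral-nemo:12b-instruct-2407-q3_K_S   a1b2c3d4e5f6  5.5 GB
mn12-3                                  a1b2c3d4e5f6  5.5 GB
mistral-nemo:12b-instruct-2407-q4_K_S   0f9e8d7c6b5a  7.1 GB'

# Print every ID that is shared by more than one name, with its names.
echo "$list_output" | awk 'NR > 1 { names[$2] = names[$2] " " $1; count[$2]++ }
END { for (id in count) if (count[id] > 1) print id ":" names[id] }'
```

In practice one would pipe the real `ollama list` output into the same `awk` program instead of the sample string.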


@jpummill commented on GitHub (Jul 24, 2024):

Thank you @rick-github for the workaround, but this also has the effect of lengthening the list with additional entries that aren't truly new or different models. It also forces users to remove multiple entries in order to actually remove the model files from their system.


@lee-b commented on GitHub (Aug 26, 2024):

> ```
> ollama cp mistral-nemo:12b-instruct-2407-q3_K_S mn12-3
> ```
>
> The copy will have pointers to the data of the original, so it doesn't take any extra space (other than the manifest file). You can find the original by looking for the `Id` in the output of `ollama list`

I've been doing this, but it's proving quite unworkable: any edit to the model creates a new model, which diverges, and there is no obvious way to trace it back to the original, or to keep the two in sync through further manual edits, since the model IDs/hashes then differ.

A true `ln -s`-style alias that simply resolves to the real model would be very beneficial. This is an important workflow: it separates client configuration ("smart-model") from server configuration ("mistral-large" / "llama-3.1-405b" / etc.).
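Ollama has no built-in alias feature, but the kind of indirection described here can be approximated client-side. The sketch below is purely hypothetical: the `~/.ollama-aliases` file format, the `resolve_model` helper, and the `oll` wrapper are all invented names, and only the first model argument is resolved:

```shell
# Hypothetical client-side alias layer: maps short names to full model names.
# Alias file format: one "alias full-model-name" pair per line.
OLLAMA_ALIAS_FILE="${OLLAMA_ALIAS_FILE:-$HOME/.ollama-aliases}"
OLLAMA_BIN="${OLLAMA_BIN:-ollama}"

resolve_model() {
    # Print the full name for an alias, or echo the argument back unchanged.
    full=$(awk -v a="$1" '$1 == a { print $2; exit }' "$OLLAMA_ALIAS_FILE" 2>/dev/null)
    echo "${full:-$1}"
}

oll() {
    # Usage: oll run mn12-3  (resolves mn12-3 via the alias file, then runs it)
    cmd="$1"; shift
    "$OLLAMA_BIN" "$cmd" "$(resolve_model "$1")"
}
```

Because nothing is copied, updating the underlying model needs no `cp`/`rm` dance; the alias file is pure client configuration, which is exactly the split described above.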


@jmorganca commented on GitHub (Sep 4, 2024):

Hi @jpummill, thanks for the issue. I think the closest thing to this right now is `ollama cp`. Note that it essentially makes a new pointer to the same model, so you can safely remove the old name. I realize that re-downloading an updated version of the model would mean re-running `ollama cp` and `ollama rm`, but I'm not sure there's much else we could add here.

<!-- gh-comment-id:2327849745 --> @jmorganca commented on GitHub (Sep 4, 2024): Hi @jpummill thanks for the issue. I think the closest thing to this right now is `ollama cp`. Note: it does make essentially a new pointer to the same model, and so you can safely remove the old name. I realize re-downloading an updated version of the model would mean you have to re-run the `ollama cp` and `ollama rm`, but I'm not sure there's much else we could add here.
Author
Owner

@mmccubbing commented on GitHub (Feb 24, 2026):

For anybody else who comes across this looking for a solution, I've just been using shell variables to hold the full name.

E.g.:

```
April="ServiceNow-AI/Apriel-1.6-15b-Thinker:Q4_K_M"
ollama run "$April"  # same as: ollama run ServiceNow-AI/Apriel-1.6-15b-Thinker:Q4_K_M
```
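One caveat with the variable approach: shell variable names can't contain hyphens, so aliases like `mn12-3` from the original request would need underscores. A minimal sketch for several models at once, e.g. in `~/.bashrc`:

```shell
# One variable per model; underscores instead of the hyphens used earlier
# in this thread, since "-" is not valid in a shell variable name.
mn12_3="mistral-nemo:12b-instruct-2407-q3_K_S"
mn12_4="mistral-nemo:12b-instruct-2407-q4_K_S"
mn12_5="mistral-nemo:12b-instruct-2407-q5_K_M"

# ollama run "$mn12_4"  # same as: ollama run mistral-nemo:12b-instruct-2407-q4_K_S
```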


Reference: github-starred/ollama#50202