[GH-ISSUE #5169] How do I find the model version in Ollama? #29018

Open
opened 2026-04-22 07:36:28 -05:00 by GiteaMirror · 20 comments
Owner

Originally created by @qzc438 on GitHub (Jun 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5169

As the title describes: how do I get the model version if I download a model from Ollama? And on which day was the model updated?

GiteaMirror added the feature request label 2026-04-22 07:36:28 -05:00

@qzc438 commented on GitHub (Jun 23, 2024):

I mean the original version of the model. For example, where was the model downloaded?


@d-kleine commented on GitHub (Jun 23, 2024):

`ollama list`
or
`ollama show Your_Model_Name`, e.g. `ollama show llama3`


@qzc438 commented on GitHub (Jun 24, 2024):

Thank you. It is working now. But is it possible for me to see when the llama3 was last updated?


@d-kleine commented on GitHub (Jun 24, 2024):

Afaik you need to check the model page: https://ollama.com/library/llama3
There is also an [ollama model updater](https://github.com/technovangelist/ollamamodelupdater)
With `ollama show llama3` you can find out your model version and update it with `ollama pull llama3`.


@qzc438 commented on GitHub (Jun 24, 2024):

I see. My question is more related to version control. For example, how can I know which llama3 I am using (there will be a date tag like 2024-06-24)?


@d-kleine commented on GitHub (Jun 24, 2024):

Again, with `ollama list`, there is an ID for each model. It also shows when it was last modified. You can then compare it with the model library page from above.

There are no date tags, as this is quite unusual in git; typically you use release tags, as you have here: `llama3:latest`
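To make that comparison scriptable, here is a minimal sketch: a small helper that pulls the ID column out of `ollama list`-style output so it can be checked against a pinned value. The column layout mirrors what `ollama list` prints; the sample ID below is made up for illustration, not a real digest.

```
# extract_id: print the ID column for a given model name from
# `ollama list`-style output read on stdin.
extract_id() {
  awk -v model="$1" '$1 == model { print $2 }'
}

# Illustrative captured output (the ID here is invented for the example):
sample='NAME            ID              SIZE    MODIFIED
llama3:latest   a1b2c3d4e5f6    4.7 GB  5 weeks ago'

printf '%s\n' "$sample" | extract_id "llama3:latest"
```

In a CI job you could compare the extracted ID against an expected value and fail the build on drift.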


@qzc438 commented on GitHub (Jun 24, 2024):

Thank you. I can see Ollama is using the git ID to control the version. This makes sense to me.

  1. A more specific question: Is it possible to download an older version of the model using the Git ID? For example, can I use `ollama pull llama3:70b:786f3184aec0`?
  2. I cannot see the older version history on this page: https://ollama.com/library/llama3:70b

@d-kleine commented on GitHub (Jun 24, 2024):

Llama3 is a bad example in this case because it's too new for now. Take this one: https://ollama.com/library/openhermes
You can see in the tag drop-down menu that there is v2.0, v2.5, etc. You can pull this via `ollama pull openhermes:v2`.

If your question is about a commit of a v2.0 model, afaik no. But LLMs don't get updated often without a new version tag. The only way you could do this is by going to the HF page, using a commit checkout of the old version of the model, and exporting it to Ollama.

I hope that answers your questions. You can find answers to all of your questions on the internet, for example here: https://www.reddit.com/r/ollama/comments/1cc1fpm/how_do_i_know_which_exact_version_is_being_used/


@qzc438 commented on GitHub (Jun 24, 2024):

Thank you. I get it.


@pharrellyhy commented on GitHub (Jul 11, 2024):

I updated 3.8b-mini-128k-instruct-q5_K_M (ID: 76b1fd52d644) today and it turns out it works quite badly for my use cases. The previous version I used has ID: 5a696b4e6899. Is it possible to get that version back, and how?


@d-kleine commented on GitHub (Jul 12, 2024):

I assume this is phi3. I don't think you can do that with Ollama (maybe @jmorganca knows this, I could imagine this could be relevant for more users).
https://ollama.com/library/phi3:3.8b-mini-128k-instruct-q5_K_M

But actually the non-quantized models are versioned on HF: https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/tree/main


@pharrellyhy commented on GitHub (Jul 12, 2024):

![Screenshot 2024-07-12 at 2 11 03 PM](https://github.com/user-attachments/assets/f72c5fd9-b68f-490e-a717-5cb1a6d35191)
Yeah, I mean I previously used phi3 3.8b-mini-128k-instruct-q5_K_M with ID: 5a696b4e6899, downloaded maybe 5 weeks ago; you can see it in the image. I updated 3.8b-mini-128k-instruct-q5_K_M yesterday, and its ID is now 76b1fd52d644. The question is: is there a way to get the previous version back?


@d-kleine commented on GitHub (Jul 12, 2024):

You could try

```
ollama run phi3:3.8b-mini-128k-instruct-q5_K_M@76b1fd52d644
```

I cannot test it as I don't have this ID downloaded. But I don't think this works.


@pharrellyhy commented on GitHub (Jul 12, 2024):

It's not working; it complains for both versions: `{"error":"model 'phi3:3.8b-mini-128k-instruct-q5_K_M:76b1fd52d644' not found, try pulling it first"}` and `{"error":"model 'phi3:3.8b-mini-128k-instruct-q5_K_M:5a696b4e6899' not found, try pulling it first"}`. Maybe Ollama doesn't support this.


@d-kleine commented on GitHub (Jul 12, 2024):

> maybe ollama doesn't support this.

I think so, yes. Maybe open a separate issue on this; I think this question will be relevant for some other users too (I also don't like that you cannot see the commit history for each model in the Ollama models library).

If you need this specific model urgently, you could try to download the non-quantized Phi-3-mini-128k-instruct model from HF from a previous checkpoint (see commit history: https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/commits/main) and quantize it with Ollama via llama.cpp to q5_K_M yourself:
https://github.com/ollama/ollama/blob/main/docs/import.md
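The import route described above boils down to a short Modelfile; this is a minimal sketch assuming you have already produced a q5_K_M GGUF locally (the filename below is a placeholder, not a real artifact):

```
# Modelfile sketch: import a locally quantized GGUF into Ollama.
# The .gguf path is a placeholder for your own quantized output file.
FROM ./Phi-3-mini-128k-instruct-q5_K_M.gguf
```

Registering it with `ollama create phi3-q5km-pinned -f Modelfile` gives the model a local name you control, so later library updates can't silently replace it.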


@qzc438 commented on GitHub (Jul 12, 2024):

After reviewing the contents, I decided to reopen this for further discussion. Providing a function for the model version is very important, as there may be differences between different model versions. At least users may need a tag to indicate which version of the model they are using.


@qzc438 commented on GitHub (Jul 19, 2024):

Hi, does anyone have any updates on this feature?


@KansaiTraining commented on GitHub (Aug 26, 2024):

`ollama show` does not show the versions of the model it is showing....
Also it requires `--modelfile` to work


@yanorei32 commented on GitHub (Jan 17, 2025):

I am experiencing the same issue.
I want to pull a model using the Ollama CLI on the CI. At that time, I would like to pin the model version with a digest. However, I can't do so.

There is a comment in the source code that suggests pinning with digest is possible:
https://github.com/ollama/ollama/blob/021817e59ace5e351b35b2e6881f83a09f038546/types/model/name.go#L96-L112

However, it does not appear to be implemented at the moment.

Are there any plans to implement the feature to pin with a digest?
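Until digest pinning is supported by the CLI, one local workaround sketch is to record the digest yourself. This rests on an assumption about Ollama's on-disk layout, not a documented contract: the short ID that `ollama list` prints appears to be a prefix of the SHA-256 of the manifest JSON stored under `~/.ollama/models/manifests/`.

```
# manifest_digest: print the first 12 hex chars of the SHA-256 of a file.
# Assumption (not a documented guarantee): applied to a manifest JSON under
# ~/.ollama/models/manifests/, this matches the ID column of `ollama list`.
manifest_digest() {
  sha256sum "$1" | cut -c1-12
}
```

For example, `manifest_digest ~/.ollama/models/manifests/registry.ollama.ai/library/llama3/latest` (the manifest path is an assumption about the local store layout) could be compared against a value recorded in your CI config.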


@KansaiTraining commented on GitHub (Feb 7, 2025):

I want to see the version of my llava model in Ollama. `ollama list` only shows me "latest" and `ollama show` does not show the version.


Reference: github-starred/ollama#29018