[GH-ISSUE #1328] pull models by ID for reproducible research #47202

Open
opened 2026-04-28 03:25:43 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @leonmoonen on GitHub (Nov 30, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1328

I love how ollama makes it easy to set up scientific experiments with various LLMs. However, one thing that is increasingly important in academia is the ability to reproduce and replicate research. For that, it would be extremely useful to pull a given model by its ID, even after a newer one has been published.

For example: last month I ran an experiment with the then-current zephyr:latest (ID: 1629f2a8a495), but now my colleague cannot reproduce my results because zephyr has been updated and zephyr:latest points to a model with ID 03af36d860cc. As far as I know, there is no tag associated with the older model, so it cannot be pulled. Treating IDs as additional tags would solve this and make ollama a valuable component in a reproducible scientific workflow.
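In the meantime, one partial mitigation is to record the exact model ID at experiment time, so collaborators can at least verify whether the weights they pulled match yours. The sketch below is illustrative, not an Ollama feature: the `record_model` function and the lockfile format are hypothetical, and the name/ID values would be copied from the output of `ollama list`.

```shell
#!/bin/sh
# Sketch: record the exact model ID (digest prefix) at experiment time
# so a collaborator can later check they are running the same weights.
# The lockfile format and function name are illustrative assumptions.

record_model() {
  # record_model NAME ID LOCKFILE
  # Appends a tab-separated line: model name, ID, UTC timestamp.
  printf '%s\t%s\t%s\n' "$1" "$2" "$(date -u +%Y-%m-%dT%H:%M:%SZ)" >> "$3"
}

# At experiment time, take NAME and ID from `ollama list`, e.g.:
#   record_model "zephyr:latest" "1629f2a8a495" models.lock
```

This does not let anyone re-pull the old model, of course; it only makes a mismatch detectable, which is why pull-by-ID (as requested above) would still be needed for full reproducibility.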

GiteaMirror added the feature request label 2026-04-28 03:25:43 -05:00

@igorschlum commented on GitHub (Nov 30, 2023):

When you use an LLM for an experiment, you can run it with Docker and pin both the Ollama version and the LLM version in order to reproduce the experiment later. Be careful: some newer versions of Ollama require newer versions of a model's files, which makes older models incompatible with newer Ollama releases (seen with the Falcon LLM).
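The pinning workflow described above can be sketched as follows. The `ollama/ollama` Docker Hub image and the `ollama pull` command are real; the specific version tag, volume name, and container name are illustrative assumptions, so substitute the version you actually ran.

```shell
# Sketch: freeze the Ollama runtime by pinning a Docker image tag, and
# keep the pulled model blobs in a named volume so server and model
# stay together. The tag 0.1.13 and the names below are illustrative.
docker run -d \
  -v ollama-experiment:/root/.ollama \
  -p 11434:11434 \
  --name ollama-pinned \
  ollama/ollama:0.1.13

# Pull the model inside that pinned container; the weights land in the
# ollama-experiment volume, which you can keep (or back up) as-is.
docker exec ollama-pinned ollama pull zephyr:latest
```

Note the caveat above still applies: this freezes your own setup, but a colleague pulling `zephyr:latest` fresh may still get different weights, which is exactly what pull-by-ID would fix.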


Reference: github-starred/ollama#47202