[GH-ISSUE #5141] Make "pull" support more than one model #65279

Closed
opened 2026-05-03 20:19:41 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @Speedway1 on GitHub (Jun 19, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5141

"ollama pull" currently only supports one argument. However, when setting up a new server, or when doing a bulk update of LLMs, we need to run a batch of LLM pulls.

It would be very handy for the command to accept more than one model as arguments.

E.g.
ollama pull deepseek-coder-v2 phi3:14b codestral

As opposed to:
for i in deepseek-coder-v2 phi3:14b codestral
do
    ollama pull "$i"
done

It also means the job can be started with nohup and booted into the background; for longer downloads it can simply run as a background task until all the models are pulled.
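In the meantime, the loop above can be wrapped in a small script. This is a minimal sketch (the `pull_all` helper name and failure reporting are my own, not part of the Ollama CLI): it pulls each model in turn, continues past individual failures, and reports which pulls failed at the end, so it is safe to run unattended under nohup.

```shell
#!/bin/sh
# pull_all: pull each model given as an argument, continuing past failures.
# Intended to be run in the background for long downloads, e.g.:
#   nohup ./pull_all.sh deepseek-coder-v2 phi3:14b codestral &
pull_all() {
    failed=""
    for model in "$@"; do
        echo "pulling $model"
        if ! ollama pull "$model"; then
            failed="$failed $model"
        fi
    done
    # Report any models that failed and exit nonzero so callers can retry.
    if [ -n "$failed" ]; then
        echo "failed:$failed" >&2
        return 1
    fi
    return 0
}
```

Because failures are collected rather than aborting the run, a flaky connection only costs the models that actually failed; the rest are already pulled when you check back.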

GiteaMirror added the feature request label 2026-05-03 20:19:41 -05:00
Author
Owner

@chrisoutwright commented on GitHub (Jun 25, 2024):

On slower internet connections, I often couldn't complete even one download; even on high-speed cable, a pull would occasionally get stuck near the end (becoming very slow for the last few percent).

I'm curious how the partial files will be managed during multiple simultaneous pulls, and what happens if one fails (is the partial download discarded?). Additionally, on Windows 10 a single pull completely saturates my bandwidth; I couldn't even open a website properly due to the many simultaneous connections per pull (possibly due to the federated fetch mode).

Are there improvements planned for those challenges? That would help this feature, I guess.

Author
Owner

@dhiltgen commented on GitHub (Sep 24, 2024):

Just noticed we have another issue tracking this #4351

Reference: github-starred/ollama#65279