[GH-ISSUE #3744] Download the models with alternative tools #28067

Open
opened 2026-04-22 05:50:39 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @pepo-ec on GitHub (Apr 19, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3744

How can I download models with other tools like wget/curl and then import them to a local Ollama server?

When I download a model, it takes up all the available bandwidth. I want to be able to limit the bandwidth so that the download takes longer but doesn't leave my LAN without connectivity.
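As a workaround sketch: the registry seems to follow an OCI-distribution-style layout (the host `registry.ollama.ai` is confirmed later in this thread, but the `/v2/` path structure here is an assumption, not documented behavior), so curl's `--limit-rate` can cap bandwidth while fetching a manifest and its blobs:

```shell
# Sketch of fetching model data with a bandwidth cap. The /v2/ URL layout is
# an assumption (OCI distribution-style); only the registry host comes from
# this thread.
MODEL=llama3
TAG=latest
MANIFEST_URL="https://registry.ollama.ai/v2/library/${MODEL}/manifests/${TAG}"
echo "$MANIFEST_URL"

# Fetch the manifest, capped at 1 MB/s so the LAN keeps some headroom:
#   curl --limit-rate 1M \
#        -H 'Accept: application/vnd.docker.distribution.manifest.v2+json' \
#        "$MANIFEST_URL" -o manifest.json
# Then fetch each blob digest listed in manifest.json the same way:
#   curl --limit-rate 1M \
#        "https://registry.ollama.ai/v2/library/${MODEL}/blobs/sha256:<digest>" \
#        -o layer.bin
# wget users can substitute --limit-rate=1m.
```

The network commands are left commented out because they require connectivity; the point is the URL construction and the `--limit-rate` flag.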

GiteaMirror added the feature request label 2026-04-22 05:50:39 -05:00

@jaeminSon commented on GitHub (Apr 25, 2024):

This feature would save my life! I want to install ollama on another machine that has no internet connection and runs on a different OS.


@bmizerany commented on GitHub (May 1, 2024):

We do not currently have plans to support alternative tools; however, we are working on changes to the upload/download components of Ollama to support better throttling and use of available bandwidth. While there is no public timeline for these yet, we hope to land them as soon as they are ready.


@amirrezaDev1378 commented on GitHub (Aug 6, 2024):

I really need this feature. My target machine does not have an internet connection.


@shameoff commented on GitHub (Nov 30, 2024):

@bmizerany could you share at least the domains from which Ollama downloads models with the `ollama pull` command? I need to configure them on my VPN client so the downloads go directly instead of through the VPN connection.


@amirrezaDev1378 commented on GitHub (Nov 30, 2024):

Hi @bmizerany,
I created a tool for this purpose:
https://github.com/amirrezaDev1378/ollama-model-direct-download

It provides direct links to download models and can also install them.

P.S.: This is the domain they use: registry.ollama.ai
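For the offline machines mentioned earlier in this thread, a GGUF file downloaded elsewhere (e.g. with the tool linked above) can be registered locally through a Modelfile and `ollama create`. The `.gguf` filename and model name below are placeholders:

```shell
# Sketch: import a locally downloaded GGUF into Ollama via a Modelfile.
# The filename and model name are example placeholders.
cat > Modelfile <<'EOF'
FROM ./llama3-8b.Q4_K_M.gguf
EOF
# On the offline machine (commented out here, as they require ollama):
#   ollama create my-llama3 -f Modelfile
#   ollama run my-llama3
```

After `ollama create`, the model runs fully offline with no pull from the registry.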

Reference: github-starred/ollama#28067