[GH-ISSUE #4034] Implement downloads via torrents #64541

Open
opened 2026-05-03 18:01:22 -05:00 by GiteaMirror · 9 comments

Originally created by @f321x on GitHub (Apr 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4034

Model downloads over a slow (10 Mbit) internet connection are really unreliable for me and crash after roughly every 5-10 GB (EOF, max retries).
At the same time, huge torrents work very reliably.
If you implemented a call to an external torrent client for model downloads, or built in a torrent client, the download experience would be more reliable and much faster, and you could save costs on hosting the files.

This library could be used to implement it:
https://github.com/anacrolix/torrent

Or simply make a subprocess call to transmission-cli (a hackier way).
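
For illustration, a minimal sketch of what an embedded client built on anacrolix/torrent might look like. The magnet link, data directory, and overall wiring are hypothetical; ollama does not publish model torrents today.

```go
package main

import (
	"log"

	"github.com/anacrolix/torrent"
)

func main() {
	// Hypothetical magnet link for a model blob.
	magnet := "magnet:?xt=urn:btih:..."

	cfg := torrent.NewDefaultClientConfig()
	cfg.DataDir = "./models" // where downloaded data is written

	client, err := torrent.NewClient(cfg)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	t, err := client.AddMagnet(magnet)
	if err != nil {
		log.Fatal(err)
	}
	<-t.GotInfo()    // wait for torrent metadata from peers
	t.DownloadAll()  // queue every piece; interrupted transfers resume from already-verified pieces
	client.WaitAll() // block until all added torrents complete

	log.Println("download complete:", t.Name())
}
```

The "hacky" alternative mentioned above would be roughly exec.Command("transmission-cli", "-w", dir, magnetOrTorrent).Run(), trading the embedded client for an external dependency on the user's machine.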

GiteaMirror added the registry, feature request labels 2026-05-03 18:01:22 -05:00

@bmizerany commented on GitHub (May 1, 2024):

Hi! We're actively working on fixing issues with regard to slow downloads. We'll continue to improve as we release new versions of Ollama. Thank you for the ticket!


@TomiWebPro commented on GitHub (May 13, 2024):

Yes! I signed in just to comment on this one. It would be really great to have all users distribute these models like a torrent network rather than downloading from a central server, which is more costly and slower. I really look forward to this being implemented in Ollama.


@purificant commented on GitHub (Jul 31, 2024):

👍 for torrents, they handle large-file distribution more gracefully: connection drops, pausing, resuming, checksums / verification, etc.
Especially relevant for some of the larger models, for example llama3.1:405b is a 231 GB download.


@attentionmech commented on GitHub (Sep 16, 2024):

Torrents please!


@trymeouteh commented on GitHub (Nov 14, 2024):

I made a suggestion a while back for the BitTorrent protocol to implement ways to download and seed torrents and use the downloads in applications on the system:

https://github.com/bittorrent/bittorrent.org/issues/150

https://github.com/bittorrent/bittorrent.org/issues/159

I would like to see a torrent option and the ability to self-host your own LLM repository:

https://github.com/ollama/ollama/issues/2841


@Modzho commented on GitHub (Nov 29, 2025):

Yes, we need a torrent option. One disconnect-reconnect and the entire download starts from the beginning. Madness!


@davidjimenez75 commented on GitHub (Apr 4, 2026):

Supporting torrent downloads of models and USB-based installation would save thousands of terabytes of traffic and energy, and bring Ollama to remote, offline users.


@ppetr commented on GitHub (Apr 23, 2026):

I also up-vote torrent-based downloads. With SHA-1, or the newer SHA-256 in BitTorrent v2, it's just as secure, and it would be very reliable.
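
To make the verification point concrete: ollama already addresses model blobs by their sha256 digest, so a torrent-delivered blob could be checked against that same digest before use. A minimal sketch; the blob path and digest below are hypothetical, for illustration only.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"io"
	"log"
	"os"
)

// verifyBlob re-hashes a downloaded file and compares it to the expected
// sha256 digest, the same scheme ollama uses to name model blobs on disk.
func verifyBlob(path, expected string) error {
	f, err := os.Open(path)
	if err != nil {
		return err
	}
	defer f.Close()

	h := sha256.New()
	if _, err := io.Copy(h, f); err != nil {
		return err
	}
	if got := hex.EncodeToString(h.Sum(nil)); got != expected {
		return fmt.Errorf("digest mismatch: got %s, want %s", got, expected)
	}
	return nil
}

func main() {
	// Hypothetical blob path and digest.
	if err := verifyBlob("./models/blobs/sha256-abc123", "abc123"); err != nil {
		log.Fatal(err)
	}
	fmt.Println("blob verified")
}
```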


@panda667 commented on GitHub (Apr 24, 2026):

Torrents please!!
