[PR #5683] [CLOSED] fix: solve network disruption during downloads, add OLLAMA_DOWNLOAD_CONN setting #22423

Closed
opened 2026-04-19 16:18:55 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/5683
Author: @supercurio
Created: 7/13/2024
Status: Closed

Base: main ← Head: main


📝 Commits (2)

  • a93389f fix: solve network disruption during downloads, add OLLAMA_DOWNLOAD_CONN setting
  • 230a912 Update docs/faq.md

📊 Changes

4 files changed (+45 additions, -3 deletions)

View changed files

📝 cmd/cmd.go (+1 -0)
📝 docs/faq.md (+17 -1)
📝 envconfig/config.go (+21 -0)
📝 server/download.go (+6 -2)

📄 Description

The process of managing bandwidth for model downloads has been an ongoing journey.

  • Users have reported difficulties downloading models since January 2024 in issue #2006
  • The feature introduced in #2995 was reverted in March 2024

This has left the Ollama server with unsafe network concurrency defaults ever since, causing problems for many users and for anyone sharing the same network, whether or not they realize Ollama is the origin of their troubles.
In the associated issue, users describe at length the problems caused and their creative mitigations.
Fortunately, the root cause is simple: 64 concurrent connections, an extremely aggressive value guaranteed to challenge any network congestion algorithm. The fix is equally straightforward: default to 1 concurrent connection per model download.
This PR addresses the root cause while adding the ability to configure network concurrency for download if required, via the OLLAMA_DOWNLOAD_CONN setting.
This PR deliberately avoids complex, ineffective, or hard-to-configure workarounds such as dynamic concurrency adjustment or manual bandwidth limiting.

From the associated commit message:
The Ollama server now downloads models using a single connection. This change addresses the root cause of issue #2006 by following best practices instead of relying on workarounds. Users have been reporting problems associated with model downloads since January 2024, describing issues such as "hogging the entire device", "reliably and repeatedly kills my connection", "freezes completely leaving no choice but to hard reset", "when I download models, everyone in the office gets a really slow internet", and "when downloading large models, it feels like my home network is being DDoSed."

The environment variable OLLAMA_DOWNLOAD_CONN can be set to control the number of concurrent connections, up to a maximum of 64 (the previous default, an aggressive value that is unsafe in some conditions). The new default value is 1, ensuring each Ollama download gets the same priority as other network activities.

An entry in the FAQ describes how to use OLLAMA_DOWNLOAD_CONN for different use cases. This patch comes with a safe and unproblematic default value.
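Illustrative usage, assuming the setting proposed by this PR (the `ollama` commands are the project's real CLI; the exact FAQ wording may differ):

```shell
# Default behavior after this PR: a single download connection,
# friendly to shared networks. No configuration needed.
ollama serve

# Opt in to more concurrent connections on an uncontended link
# (values above 64 would be clamped to the previous default):
OLLAMA_DOWNLOAD_CONN=8 ollama serve
```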

Changes include updates to the envconfig/config.go, cmd/cmd.go, server/download.go, and docs/faq.md files.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-19 16:18:55 -05:00

Reference: github-starred/ollama#22423