[GH-ISSUE #2155] Unable to push: max retries exceeded on slower connections #1229

Open
opened 2026-04-12 11:00:08 -05:00 by GiteaMirror · 15 comments
Owner

Originally created by @sqs on GitHub (Jan 23, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2155

Originally assigned to: @mxyng on GitHub.

I was able to push the q4_0 tag to https://ollama.ai/sqs/starchat, but when I try to push other tags, I am getting an error (see below). Note the %!F(MISSING) below in case that is an issue.

The file size of the one that failed is 7.7 GB. The q4_0 push that succeeded was 8.4 GB.

$ for i in q3_K_M q4_K_M q5_K_S q5_K_M f16 f32; do ollama create sqs/starchat:beta-$i -f Modelfile.$i && ollama push sqs/starchat:beta-$i; done
transferring model data 
creating model layer 
creating template layer 
using already created layer sha256:62b0be00997dd300b03868d7858d28f41488c0222bfc4fbb6ceb3eae39a5d4d7 
using already created layer sha256:ca40f7f0151766210faa524fa8710aabf07284671aaac525eeac350d64d05132 
using already created layer sha256:dd473af9080c0674443f41cb6feb59ac1e24c34f18255c78d083f138f3275a0c 
writing manifest 
success 
retrieving manifest 
pushing 62b0be00997d...   0% ▕                                                                ▏ 1.3 MB/8.2 GB  5.2 MB/s  26m34s
Error: max retries exceeded: Put "https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/repositories/sqs/starchat/_uploads/55c91d69-edf4-4a50-a278-2c7c697ba4e4/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=XXX%!F(MISSING)20240123%!F(MISSING)auto%!F(MISSING)s3%!F(MISSING)aws4_request&X-Amz-Date=20240123T072755Z&X-Amz-Expires=86400&X-Amz-SignedHeaders=host&partNumber=29&uploadId=XXX&X-Amz-Signature=XXX": write tcp 192.168.2.154:51301->104.18.9.90:443: write: broken pipe

(Note: I replaced URL query params that may contain credentials with XXX.)

This may just be an ephemeral error. I'll close this tomorrow if the other pushes succeed.

GiteaMirror added the networking and bug labels 2026-04-12 11:00:08 -05:00

@sqs commented on GitHub (Jan 23, 2024):

Got a different error when trying to push sqs/starcoder:beta-q4_K_M:

pushing 3708ce083ec6...   0% ▕                                                               ▏ 1.0 MB/10.0 GB                  
Error: max retries exceeded: http status 502 Bad Gateway: <?xml version="1.0" encoding="UTF-8"?><Error><Code>InternalError</Code><Message>We encountered an internal connectivity issue. Please try again.</Message></Error>

And I also got the same error around the same time when trying to push the :beta-q3_K_M tag again:

$ ollama push sqs/starchat:beta-q3_K_M
retrieving manifest 
pushing 62b0be00997d...   0% ▕                                                                ▏ 1.0 MB/8.2 GB                  
Error: max retries exceeded: http status 502 Bad Gateway: <?xml version="1.0" encoding="UTF-8"?><Error><Code>InternalError</Code><Message>We encountered an internal connectivity issue. Please try again.</Message></Error>

@sqs commented on GitHub (Jan 23, 2024):

The ollama serve logs have some more information. I see:

<title>Worker exceeded resource limits | dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com | Cloudflare</title>

...

<h2 class="cf-subheadline" data-translate="error_desc">Worker exceeded resource limits</h2>

...

<p>You've requested a page on a website (dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com) that is on the <a href="https://www.cloudflare.com/5xx-error-landing/" target="_blank">Cloudflare</a> network. An unknown error occurred while rendering the page.</p>

@sqs commented on GitHub (Jan 23, 2024):

Yeah, I was only able to upload that first q4_0 one. The others all failed for the reasons given above.


@sqs commented on GitHub (Jan 23, 2024):

On faster Wi-Fi (thanks, Replicate!), the uploads are working. Maybe less total transfer time means a lower likelihood of hitting an ephemeral error or a worker time limit.


@jmorganca commented on GitHub (Jan 23, 2024):

If it's okay I'll leave this open so we can hunt down why it fails on slower connections 😊


@sqs commented on GitHub (Jan 23, 2024):

Home Wi-Fi: 5-10 MB/s upload. Replicate Wi-Fi (where it worked): ~75-90 MB/s upload.


@olafgeibig commented on GitHub (Feb 22, 2024):

I most probably encountered the same issue: https://github.com/ollama/ollama/issues/2094. I could work around it by using a VPN, although that was even a bit slower. I used the Google One VPN.


@tosh commented on GitHub (Apr 3, 2024):

I also just ran into this on a slow connection. Unable to complete the push.


@BramVanroy commented on GitHub (Apr 5, 2024):

Can confirm that I am also experiencing this issue on a 2.4 MB/s upload connection.


@joadataarg commented on GitHub (Apr 23, 2024):

Experiencing the error with an upload speed of 1.5.


@tarbard commented on GitHub (Apr 26, 2024):

I'm getting this constantly. I've only been able to upload 200-300 MB before it errors. My upload speed is 600 Mbps.

Usually it's "read: connection reset by peer", but I also sometimes got:
http status 502 Bad Gateway: <?xml version="1.0" encoding="UTF-8"?><Error><Code>InternalError</Code><Message>We encountered an internal connectivity issue. Please try again.</Message></Error>

Is there a way to get it to try a different storage node, in case the one I'm connected to (dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com in this case) is faulty?


@electricalgorithm commented on GitHub (Apr 28, 2024):

Same here. With an upload speed of up to 5 MB/s, I can't push anything.


@tarbard commented on GitHub (May 12, 2024):

I've tried:

  • changing my DNS, tried 8.8.8.8 and 1.1.1.1
  • deleting my pub/priv key and using the newly generated one
  • recreating model from gguf
  • using new modelname
  • using model name without any punctuation

no luck.

I still get connection reset by peer or bad gateway after 200-300 MB uploaded.


@Aminadaven commented on GitHub (May 14, 2024):

The only way I succeeded in uploading even a small GGUF file (less than 3 GB) was on a really fast network. On my network, even when the upload speed is okay (about 2 MB per second), it always fails. It looks like it restarts the connection and every time tries to re-upload from scratch.


@mmounirf commented on GitHub (Apr 5, 2025):

Constantly getting connection resets, with ollama serve showing the following log:

time=2025-04-05T17:34:36.128+02:00 level=INFO source=upload.go:269 msg="26194f6a49bb part 6 attempt 0 failed: Put \"https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/repositories/ollama/ollama/_uploads/c325bd23-12f0-46b2-a63c-a1e12b41e04f/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%2F20250405%2Fauto%2Fs3%2Faws4_request&X-Amz-Date=20250405T152857Z&X-Amz-Expires=86400&X-Amz-SignedHeaders=host&partNumber=7&uploadId=ANo4CVhk8pgLoHlbimEc1NK7-eBiICHlPSZVlA89VfPk6hb88c6FD76ObBukUZGID2AyfDClH3NxL1GW-NCOEIbcWkMM7GC1oxHppN3PajzCUVDWE__zc3lPy2X68fZZK5UZ5yVpDPhSxxWSN2py8Pfe48NmRd79FGlXGgW0qGSKqz1lq_GGnTG7Z4LGdkHiVHz2M7JnmUZf_UWypJbL95_9EUYAz4mYFpkBCeniE2RxJYESTcKyA0wvIMRVZ4RIblS24rfTA0U57MvJHt7axsUjQrU1z7BmmsOQLMUFOQ9iTE7hMabRnZ4P-FFW9b7aVajpMkXKjJbW4V2diGUOBVQ&X-Amz-Signature=42168a2df69415ab6ad00042392f241fb440ba69cd967e8c68d8e7144ec14ba6\": write tcp [2a02:908:1312:9480:18a6:242b:2d71:2c7]:50166->[2a06:98c1:58::12e]:443: use of closed network connection, retrying in 1s"

until finally halts with

Error: max retries exceeded: Put "https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/repositories/ollama/ollama/_uploads/c325bd23-12f0-46b2-a63c-a1e12b41e04f/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%2F20250405%2Fauto%2Fs3%2Faws4_request&X-Amz-Date=20250405T152904Z&X-Amz-Expires=86400&X-Amz-SignedHeaders=host&partNumber=12&uploadId=AONRve-Hmp6wbjRF6NqcNA7beUmX5lDJ8FMrjuKoqQ6MzyZp3Tg5PuLHbK1V4WplQ6LdJI2GUNHPnhnWTPdmObwYcbFV86gZg2_Ymyg9GEP7eYxXEC7GrlaTlsictoM3NPj2a6yaYH8BJWyWI8ItPuNMom51OkogNQTK5gRDCSsI9Q1Mv9v56QAyg3FLx5_3zpYmhIdYJt874bYcQVS3AxWhbryEBof-7V-rfr1Ovu-VMrNI2CJxg1nQqN3hiFC1TFf_BY18h2LV8RMN3iYX78A5u3Qxxr3Rjka7NvJ1ViITpD099BqlJWVapq5MeW7RZy5C8ZNv1FtICuGTA8mthEg&X-Amz-Signature=df80519a91811cea9c280b80c9b5273ce2b727e5d165c6e1d545d1833770b327": use of closed network connection

Trying to push jais-13b-chat with an 11 GB file size and an average upload speed of 40 Mbps. Network DNS set to 1.1.1.1.
