[GH-ISSUE #3786] My internet is too slow to download the model #28097

Open
opened 2026-04-22 05:55:08 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @skystar7 on GitHub (Apr 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3786

What is the issue?

I keep getting the same error:

ollama run llama3
pulling manifest
pulling 00e1317cbf74... 1% ▕ ▏ 28 MB/4.7 GB 61 KB/s 20h58m
Error: max retries exceeded: Get "https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/00/00e1317cbf74d901080d7100f57580ba8dd8de57203072dc6f668324ba545f29/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%!F(MISSING)20240420%!F(MISSING)auto%!F(MISSING)s3%!F(MISSING)aws4_request&X-Amz-Date=20240420T204533Z&X-Amz-Expires=1200&X-Amz-SignedHeaders=host&X-Amz-Signature=8e4efa642799d32a9e502990baa74226f9f0f740b2cdb7ed0bd6b6741e0106df": net/http: TLS handshake timeout

Is there a way to download llama3 externally and then use it with ollama?

Many thanks

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.1.32

GiteaMirror added the networking and feature request labels 2026-04-22 05:55:08 -05:00
Author
Owner

@dhiltgen commented on GitHub (May 2, 2024):

We don't currently have a formal way to do this, but you can take a look at where the models are stored and transfer the files manually. https://github.com/ollama/ollama/blob/main/docs/faq.md#where-are-models-stored
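The FAQ link above gives the default model directories (macOS ~/.ollama/models, Linux /usr/share/ollama/.ollama/models, Windows C:\Users\<user>\.ollama\models). A minimal sketch of that manual transfer, assuming the model was pulled on a machine with a working connection; copy_model and its arguments are illustrative helpers for this thread, not an Ollama command:

```shell
#!/bin/sh
# Copy a pulled model's manifest plus the blob store from one Ollama
# model directory to another (e.g. onto removable media for the
# machine with the slow connection).
# Note: this copies every blob in the store, which is fine when only
# the one model has been pulled there.
copy_model() {
  src="$1"; dst="$2"; model="$3"
  mkdir -p "$dst/manifests/registry.ollama.ai/library" "$dst/blobs"
  cp -r "$src/manifests/registry.ollama.ai/library/$model" \
        "$dst/manifests/registry.ollama.ai/library/"
  cp "$src"/blobs/* "$dst/blobs/"
}

# Example:
#   copy_model ~/.ollama/models /mnt/usb/models llama3
```

On the destination machine, place the copied files under the default model path (or point OLLAMA_MODELS at the copied directory, per the same FAQ) and the model should run without re-downloading.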

Author
Owner

@Alchemistqqqq commented on GitHub (May 8, 2024):

I have also encountered the same problem. May I ask if you have solved it and found a suitable solution?

Author
Owner

@edwin2jiang commented on GitHub (May 12, 2024):

I have met this problem too. I think Cloudflare may be the cause, since it can be blocked on networks in China.

Author
Owner

@bmizerany commented on GitHub (May 13, 2024):

Hi everyone, we're working on these issues and experimenting with a new solution with our partners at the edge. If you have time, please consider helping us test if this helps! We're collecting as much feedback as possible and could use your help!

https://github.com/ollama/ollama/issues/1736#issuecomment-2102983113

Author
Owner

@tfly2024 commented on GitHub (Sep 28, 2024):

I have the same issue, but when I pull with Command Prompt (CMD), it reports an error: 'Error: pull model manifest: open C:\Users\Administrator.ollama\id_ed25519' - The system cannot find the file specified. After rechecking, I found that 'id_ed25519' and 'id_ed25519.pub' have been deleted accidentally.

Author
Owner

@tfly2024 commented on GitHub (Sep 28, 2024):

> I have the same issue, but when I pull with Command Prompt (CMD), it reports an error: 'Error: pull model manifest: open C:\Users\Administrator.ollama\id_ed25519' - The system cannot find the file specified. After rechecking, I found that 'id_ed25519' and 'id_ed25519.pub' have been deleted accidentally.

Running the command when I execute open-webui with Docker also gives the error. All programs are Windows 64-bit.
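The error above means the client key pair is missing from the Ollama home directory (the path in the message is presumably C:\Users\Administrator\.ollama with the backslash dropped in the output). A quick existence check, as a sketch; OLLAMA_KEY_DIR is an illustrative variable for this snippet, not an Ollama setting, and on Unix-like installs the directory is typically ~/.ollama:

```shell
#!/bin/sh
# Report whether the Ollama client key pair exists; per the comment
# above, `ollama pull` fails with "pull model manifest: open ...
# id_ed25519" when the private key file is gone.
OLLAMA_KEY_DIR="${OLLAMA_KEY_DIR:-$HOME/.ollama}"
for f in id_ed25519 id_ed25519.pub; do
  if [ -f "$OLLAMA_KEY_DIR/$f" ]; then
    echo "found: $f"
  else
    echo "missing: $f" >&2
  fi
done
```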

Author
Owner

@blackcatpolice commented on GitHub (Jan 28, 2025):

I found a workaround: ollama run llama3.2 ""

It starts downloading, but eventually fails with: Error: max retries exceeded: EOF

Manually running ollama run llama3.2 "" again resumes the download.

From: https://github.com/ollama/ollama/blob/main/docs/faq.md#where-are-models-stored

Image: https://github.com/user-attachments/assets/0cfa4f33-db4b-49f8-b673-28000e1750f1
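The workaround above relies on Ollama resuming partially downloaded blobs, so re-running the pull continues where the previous attempt stopped. A sketch of automating the retries; retry_pull is an illustrative helper written for this thread, not an Ollama feature, and RETRY_DELAY is likewise made up here:

```shell
#!/bin/sh
# Re-run a command until it exits successfully; with `ollama pull`,
# each attempt resumes the partially downloaded blobs instead of
# starting over.
RETRY_DELAY="${RETRY_DELAY:-5}"
retry_pull() {
  until "$@"; do
    echo "pull interrupted, retrying in ${RETRY_DELAY}s..." >&2
    sleep "$RETRY_DELAY"
  done
}

# Example:
#   retry_pull ollama pull llama3.2
```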

Reference: github-starred/ollama#28097