[GH-ISSUE #4314] the ai model downloading not working #2691

Closed
opened 2026-04-12 13:00:55 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @Tochange143 on GitHub (May 10, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4314

What is the issue?

```
ollama run mistral:text
pulling manifest
Error: Head "https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/5b/5b5c2a563a287aa9bf9be7499fe7e0630add02089be3f50ee494087f67683fbb/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%!F(MISSING)20240510%!F(MISSING)auto%!F(MISSING)s3%!F(MISSING)aws4_request&X-Amz-Date=20240510T160627Z&X-Amz-Expires=1200&X-Amz-SignedHeaders=host&X-Amz-Signature=21ab36ad0c30857906a5f710d6d26dd63c905bd84ba24e580f43dff8b8b96133": dial tcp: lookup dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com: no such host
```
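The error shows Ollama failing at the DNS step: the OS resolver cannot resolve the Cloudflare R2 hostname that the registry redirects blob downloads to. A quick way to confirm whether this is a system DNS problem rather than an Ollama problem is to try the same lookup directly; a minimal Python sketch (the hostname is copied from the error above):

```python
import socket

# Hostname taken verbatim from the "no such host" error above.
host = "dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com"

try:
    # Ask the OS resolver for the host's addresses -- the same lookup
    # Ollama's HTTP client performs before opening a TCP connection.
    infos = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
    print("resolved:", sorted({info[4][0] for info in infos}))
except socket.gaierror as exc:
    # This is the Python equivalent of Go's "dial tcp: lookup ...:
    # no such host" failure.
    print("DNS lookup failed:", exc)
```

If the lookup fails here too, the fault lies with the system resolver (configured DNS servers, hosts file, VPN/proxy software) rather than with Ollama itself.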

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

v0.1.34

GiteaMirror added the bug label 2026-04-12 13:00:55 -05:00
Author
Owner

@pdevine commented on GitHub (May 14, 2024):

are you behind a corporate proxy/firewall?

Also, are you using wsl2 or is it just plain Windows?

Author
Owner

@Tochange143 commented on GitHub (May 15, 2024):

> are you behind a corporate proxy/firewall?
>
> Also, are you using wsl2 or is it just plain Windows?

I did not use any proxy or firewall. I use my personal laptop. I have downloaded many models before. This time the model download failed more than 5 times. After that I started getting this error message, and now I can't download any model. To solve the issue I uninstalled Ollama and reinstalled it, but that did not work.

Author
Owner

@pdevine commented on GitHub (May 16, 2024):

I don't _think_ it's a problem with Ollama. I think it's almost certainly a Windows issue w/ DNS, but I'm not sure what's causing it.

Author
Owner

@pdevine commented on GitHub (May 24, 2024):

@Tochange143 Did you figure out what was wrong?

Author
Owner

@Tochange143 commented on GitHub (May 25, 2024):

For the past three days, I've been trying to run `ollama run mistral`, but I keep encountering errors. When I ran the command as an administrator, I was able to download the AI model successfully. After that, the errors stopped occurring. Previously, I resolved the issue by uninstalling Ollama and clearing the cache.

Author
Owner

@vishnuvardhan2005 commented on GitHub (Jan 25, 2025):

Changing the DNS server IPs and running `ipconfig /flushdns` resolved the issue for me.


Reference: github-starred/ollama#2691