[GH-ISSUE #6366] Unable to Pull Model Manifest - "Get https://registry.ollama.ai/v2/library/llama3/manifests/latest: EOF" #3997

Closed
opened 2026-04-12 14:52:18 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @uestcxt on GitHub (Aug 15, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6366

What is the issue?

Description

I am experiencing an issue when trying to pull the llama3 model using the ollama CLI. The process fails with an "EOF" error. I have also tried pulling other models, but the same error occurs.
I have verified that DNS resolution works correctly, as I can resolve the registry.ollama.ai domain:

PS C:\Windows\System32> nslookup registry.ollama.ai
Server:  www.huaweimobilewifi.com
Address:  fe80::1a9e:2cff:fe47:b669

Non-authoritative answer:
Name:    registry.ollama.ai
Addresses:  2606:4700:3036::6815:4be3
           2606:4700:3034::ac43:b6e5
           104.21.75.227

Environment:

  • Ollama Version: 0.3.6
  • Operating System: Windows 11 (Version 10.0.22631, Build 22631)
  • GPU: NVIDIA RTX 4090 Laptop
  • Memory: 64 GB
  • CPU: Intel(R) Core(TM) i9-14900HX, 2200 MHz, 24 cores, 32 logical processors
PS C:\Windows\System32> ollama run llama3
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama3/manifests/latest": EOF
PS C:\Windows\System32> ollama -v
ollama version is 0.3.6

server.log

time=2024-08-15T09:15:43.929+08:00 level=INFO source=images.go:1059 msg="request failed: Get \"https://registry.ollama.ai/v2/library/llama3/manifests/latest\": EOF"
[GIN] 2024/08/15 - 09:15:43 | 200 |   11.3628421s |       127.0.0.1 | POST     "/api/pull"
[GIN] 2024/08/15 - 09:16:02 | 200 |            0s |       127.0.0.1 | HEAD     "/"
[GIN] 2024/08/15 - 09:16:02 | 404 |            0s |       127.0.0.1 | POST     "/api/show"
time=2024-08-15T09:16:02.628+08:00 level=INFO source=images.go:1059 msg="request failed: Get \"https://registry.ollama.ai/v2/library/llama3/manifests/latest\": EOF"

It appears that the process is attempting to fetch the manifest but is encountering an EOF error. I have confirmed that my network connection is stable.

Could you please advise on how to resolve this issue?
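As a diagnostic (not part of the original report), the failing manifest URL from the error message can be requested directly to check whether the EOF reproduces outside of Ollama. An EOF or connection reset here as well would point at the network path (proxy, VPN, or CDN edge) rather than the ollama client. The proxy address below is a hypothetical placeholder:

```shell
# Fetch the same manifest endpoint the ollama client requests.
# -v shows the TLS handshake, where an EOF typically surfaces;
# --max-time bounds the attempt so a hang is distinguishable from an EOF.
curl -v --max-time 30 https://registry.ollama.ai/v2/library/llama3/manifests/latest

# If traffic goes through a proxy, test it explicitly
# ("http://proxy.example:8080" is a placeholder, not from the report):
# curl -v --proxy http://proxy.example:8080 \
#   https://registry.ollama.ai/v2/library/llama3/manifests/latest
```

If the direct curl succeeds while `ollama pull` still fails, comparing proxy-related environment variables (`HTTPS_PROXY` etc.) between the shell and the Ollama service environment would be the next step.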

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.3.6

GiteaMirror added the bug, needs more info labels 2026-04-12 14:52:18 -05:00

@rick-github commented on GitHub (Aug 15, 2024):

This is probably a transient issue; there are reports that Cloudflare has been a bit flaky.


@pdevine commented on GitHub (Sep 1, 2024):

@uestcxt can you try again? Are you still seeing the issue?


@R4ZZ3 commented on GitHub (Sep 11, 2024):

I am having similar issues:
![image](https://github.com/user-attachments/assets/098462a2-3eec-4aa6-8b9c-52644a71929d)

![image](https://github.com/user-attachments/assets/a3ee5972-832a-4a44-8970-991a115ee861)
GGUF files from here: https://huggingface.co/mradermacher/Ahma-3B-Instruct-GGUF/tree/main
Original model: https://huggingface.co/Finnish-NLP/Ahma-3B-Instruct


@pdevine commented on GitHub (Sep 17, 2024):

@R4ZZ3 I'm not sure how that's related to this issue? I'm assuming you're having problems running a bad GGUF file, but this issue is for `ollama pull`.

I'm going to go ahead and close this issue though. We can reopen if it keeps happening.


Reference: github-starred/ollama#3997