[GH-ISSUE #9441] 404 request #6156

Closed
opened 2026-04-12 17:30:21 -05:00 by GiteaMirror · 1 comment

Originally created by @zkw1813133460 on GitHub (Mar 1, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9441

What is the issue?

```shell
[GIN] 2025/03/01 - 21:33:29 | 404 | 525.2µs | 127.0.0.1 | POST "/api/show"
time=2025-03-01T21:33:30.237+08:00 level=INFO source=images.go:669 msg="request failed: Get \"https://registry.ollama.ai/v2/library/llama3.2/manifests/latest\": read tcp 192.168.124.8:50325->104.21.75.227:443: wsarecv: An existing connection was forcibly closed by the remote host."
[GIN] 2025/03/01 - 21:33:30 | 200 | 327.7631ms | 127.0.0.1 | POST "/api/pull"
```

Relevant log output


OS

Windows 11

GPU

No response

CPU

No response

Ollama version

0.5.13

GiteaMirror added the bug label 2026-04-12 17:30:21 -05:00

@rick-github commented on GitHub (Mar 1, 2025):

The 404 is because the command `ollama show llama3.2` was run and the model hasn't been downloaded. The subsequent `ollama pull llama3.2` failed because of a network or server issue. Re-run `ollama pull llama3.2` and it might work. If it doesn't, run the following and post the results:

```
curl -v https://registry.ollama.ai/v2/library/llama3.2/manifests/latest
```
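If the connection resets turn out to be transient (a flaky link or the registry's CDN dropping connections), simply retrying the pull a few times often succeeds. A minimal sketch of such a retry loop, assuming the standard `ollama` CLI is on PATH; the loop itself is illustrative and not part of the original comment:

```shell
# Illustrative retry loop (assumption, not from the original comment):
# attempt the pull up to three times, pausing briefly between failures.
for attempt in 1 2 3; do
  ollama pull llama3.2 && break
  echo "pull attempt $attempt failed, retrying in 5s..." >&2
  sleep 5
done
```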
