[GH-ISSUE #3212] ollama pull modelName Error #64017

Closed
opened 2026-05-03 15:52:31 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @ZPLSSSTD on GitHub (Mar 18, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3212

What is the issue?

I once accidentally installed Ollama: 7b successfully. Afterwards, I executed the command `ollama pull llama2`, but it failed with the following error:

```
pulling manifest Error: pull model manifest: Get "https://ollama.com/token?nonce=qKzQl7GvJl7HVA-mW_-3Ow&scope=repository%!A(MISSING)library%!F(MISSING)llama2%!A(MISSING)pull&service=ollama.com&ts=1710731810": read tcp 192.168.4.190:13291->34.120.132.20:443: wsarecv: An existing connection was forcibly closed by the remote host.
```
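(Aside on the garbled `%!A(MISSING)` / `%!F(MISSING)` fragments in that URL: they look like Go `fmt` artifacts, where an already URL-encoded scope was passed through a printf-style formatter that treated `%3A` and `%2F` as format verbs. Assuming the scope is the usual registry-token scope `repository:library/llama2:pull`, a quick Python check shows what the encoded query parameter should have looked like:)

```python
from urllib.parse import quote

# Hypothetical reconstruction of the token scope from the error URL.
scope = "repository:library/llama2:pull"

# safe="" forces ':' -> %3A and '/' -> %2F, matching standard URL encoding.
encoded = quote(scope, safe="")
print(encoded)  # → repository%3Alibrary%2Fllama2%3Apull
```

So the mangled fragments are cosmetic logging damage, not the cause of the failure; the underlying error is the TCP connection being reset.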

I searched for the issue, but the answers I found did not solve my problem. I have tried restarting my computer and changing my WiFi, but it still doesn't work.

What did you expect to see?

I expected the model to download normally.

Steps to reproduce

No response

Are there any recent changes that introduced the issue?

No response

OS

Windows

Architecture

amd64

Platform

No response

Ollama version

0.1.29

GPU

Nvidia

GPU info

NVIDIA GeForce RTX 2060

CPU

AMD

Other software

No response

GiteaMirror added the networking label 2026-05-03 15:52:31 -05:00

@BruceMacD commented on GitHub (Mar 21, 2024):

Hi @ZPLSSSTD, sorry you're hitting this issue. This looks similar to #2448, you may be having this issue due to a proxy or firewall.
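(For readers hitting the same `wsarecv: An existing connection was forcibly closed` error behind a corporate proxy: Ollama honors the standard `HTTPS_PROXY` environment variable for registry pulls. A minimal sketch, where the proxy address below is a placeholder you must replace with your own:)

```shell
# Sketch only: route Ollama's outbound registry traffic through a proxy.
# http://proxy.example.com:8080 is a placeholder, not a real endpoint.
export HTTPS_PROXY=http://proxy.example.com:8080
ollama pull llama2
```

On Windows (the reporter's platform), the PowerShell equivalent before starting the Ollama server would be `$env:HTTPS_PROXY = "http://proxy.example.com:8080"`. Note that the variable must be set in the environment of the `ollama serve` process, not just the client shell.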


@mxyng commented on GitHub (Mar 28, 2024):

Duplicate of #3112.


Reference: github-starred/ollama#64017