[GH-ISSUE #3112] Windows Error: pull model manifest returns wsarecv: An existing connection was forcibly closed by the remote host. #27672

Closed
opened 2026-04-22 05:11:43 -05:00 by GiteaMirror · 11 comments

Originally created by @heimu-liu on GitHub (Mar 13, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3112

I can't download the model:
[app.log](https://github.com/ollama/ollama/files/14589977/app.log)
[server.log](https://github.com/ollama/ollama/files/14589978/server.log)

```
PS C:\Users\heimu\AppData\Local\Ollama> ollama pull llama2
pulling manifest
Error: pull model manifest: Get "https://ollama.com/token?nonce=A-QmGZFS0za-Kv0GKrDy3Q&scope=repository%!A(MISSING)library%!F(MISSING)llama2%!A(MISSING)pull&service=ollama.com&ts=1710339130": read tcp 192.168.247.214:56798->34.120.132.20:443: wsarecv: An existing connection was forcibly closed by the remote host.
PS C:\Users\heimu\AppData\Local\Ollama>
```

GiteaMirror added the networking and windows labels 2026-04-22 05:11:43 -05:00

@BruceMacD commented on GitHub (Mar 15, 2024):

Hi @heimu-liu, this seems similar to #1149. Are you using Ollama behind a proxy? If so, check out the proxy configuration guide:
https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-use-ollama-behind-a-proxy

Hopefully that helps, let me know if that is the case.


@heimu-liu commented on GitHub (Mar 16, 2024):


Thank you very much for your reply!
I've tested it on both wireless and LTE, with and without a proxy, and I still get the same error.


@dhiltgen commented on GitHub (Mar 21, 2024):

Similar to #3191 and #2448


@Felixchen1024 commented on GitHub (Mar 28, 2024):

How can I fix this?


@mxyng commented on GitHub (Mar 28, 2024):

Looking at the request both from the error in your description and the server access logs, it appears the request is getting mangled somehow.

This is being sent:

```
https://ollama.com/token?nonce=A-QmGZFS0za-Kv0GKrDy3Q&scope=repository%!A(MISSING)library%!F(MISSING)llama2%!A(MISSING)pull&service=ollama.com&ts=1710339130
```

instead of this:

```
https://ollama.com/token?nonce=A-QmGZFS0za-Kv0GKrDy3Q&scope=repository%3Alibrary%2Fllama2%3Apull&service=ollama.com&ts=1710339130
```

This is likely caused by passing the real URL into `fmt.Printf` or a similar function: [example](https://go.dev/play/p/V0sMmL70jIs).

Is it possible there's a man-in-the-middle proxy mangling the request?


@chopeen commented on GitHub (Mar 28, 2024):

My experiments in #2448 suggest that a correct URL is called and it is only mangled later, when logged.


I am facing the same problem when connected to a corporate network (Ubuntu 22.04, Ollama installed with Homebrew):

```
$ ollama run llava
pulling manifest
Error: pull model manifest: Get "https://ollama.com/token?nonce=SW96RgmctcQmHJ37NXJ8KQ&scope=repository%!A(MISSING)library%!F(MISSING)llava%!A(MISSING)pull&service=ollama.com&ts=1711106784": read tcp 10.144.68.189:40600->34.120.132.20:443: read: connection reset by peer
```

The problem first appeared in `v0.1.28`.

When I downgrade to `v0.1.27`, I am able to pull the models again.


@heimu-liu commented on GitHub (Mar 29, 2024):

> My experiments in #2448 suggest that a correct URL is called and it is only mangled later, when logged.
>
> I am facing the same problem when connected to a corporate network (Ubuntu 22.04, Ollama installed with Homebrew):
>
> ```
> $ ollama run llava
> pulling manifest
> Error: pull model manifest: Get "https://ollama.com/token?nonce=SW96RgmctcQmHJ37NXJ8KQ&scope=repository%!A(MISSING)library%!F(MISSING)llava%!A(MISSING)pull&service=ollama.com&ts=1711106784": read tcp 10.144.68.189:40600->34.120.132.20:443: read: connection reset by peer
> ```
>
> The problem first appeared in `v0.1.28`.
>
> When I downgrade to `v0.1.27`, I am able to pull the models again.

Thank you very much!


@heimu-liu commented on GitHub (Mar 29, 2024):

Problem solved by downgrading to `v0.1.27`.


@chopeen commented on GitHub (Mar 29, 2024):

@heimu-liu Downgrading is merely a workaround; hopefully the problem can actually be solved in a future version of Ollama.

@mxyng Do you prefer to reopen this issue or should I create a new one with all the information known so far?


@chopeen commented on GitHub (Apr 2, 2024):

I opened a new issue: #3452.


@nitinkr0411 commented on GitHub (Apr 23, 2024):

I'm facing the same issue with the latest version of Ollama; it no longer works behind a corporate proxy.
Downgrading to `v0.1.27` makes it work again.


Reference: github-starred/ollama#27672