[GH-ISSUE #9986] ollama runners crashing with wsarecv: An existing connection was forcibly closed by the remote host #6543

Closed
opened 2026-04-12 18:09:36 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @azizbtk on GitHub (Mar 25, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9986

What is the issue?

Unable to run models such as qwen2.5 locally (v0.6.2); they fail with the following error:
wsarecv: An existing connection was forcibly closed by the remote host

The latest version that works fine for me is 0.3.11.

Relevant log output


OS

Windows

GPU

Intel

CPU

Intel

Ollama version

0.6.2

GiteaMirror added the needs more info, bug labels 2026-04-12 18:09:36 -05:00
Author
Owner

@rick-github commented on GitHub (Mar 25, 2025):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) may aid in debugging.

Author
Owner

@Jacket1608 commented on GitHub (Mar 28, 2025):

My gemma3 has the same problem.

Author
Owner

@rick-github commented on GitHub (Mar 28, 2025):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) may aid in debugging.

Author
Owner

@Shtevlogs commented on GitHub (Mar 30, 2025):

I was running into this issue on an AMD RX 7900 XT. Downloading the Windows AMD zip directly from the releases page (https://github.com/ollama/ollama/releases) did the trick.

Author
Owner

@azizbtk commented on GitHub (Mar 30, 2025):

Same issue even after downloading the zip and running Ollama from there:

Error: POST predict: Post "http://127.0.0.1:55406/completion": read tcp 127.0.0.1:55408->127.0.0.1:55406: wsarecv: An existing connection was forcibly closed by the remote host.

Author
Owner

@rick-github commented on GitHub (Mar 30, 2025):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) may aid in debugging.

Author
Owner

@katmandoo212 commented on GitHub (Apr 20, 2025):

[server-2.log](https://github.com/user-attachments/files/19824862/server-2.log)

You can see in the server log that the port number is way off; it should be 11434. If I revert to 0.5.12, port 11434 is used, but after that version, at least on Windows 10, it attempts to use a different port on every execution.

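A port that changes on every run is the usual behavior of binding to port 0, where the OS assigns a free ephemeral port each time. A minimal Python sketch of that mechanism (illustrative only; this is not Ollama's actual code, and whether Ollama's runner does exactly this is an assumption):

```python
import socket

# Binding to port 0 asks the OS to pick a free ephemeral port, so the
# assigned port differs on every run -- which would produce high, varying
# port numbers (e.g. 55406) rather than the fixed API port 11434.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.bind(("127.0.0.1", 0))
    port = s.getsockname()[1]
    print(port)  # an OS-chosen ephemeral port, different each run
```

Running this repeatedly prints a different port each time, matching the behavior described in the log.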

Reference: github-starred/ollama#6543