[GH-ISSUE #6404] Error during API call: litellm.APIConnectionError: Ollama Error - {'error': 'error reading llm response: read tcp 127.0.0.1:5644->127.0.0.1:5600: wsarecv: An existing connection was forcibly closed by the remote host.'} #4022

Closed
opened 2026-04-12 14:53:55 -05:00 by GiteaMirror · 6 comments

Originally created by @720pixel on GitHub (Aug 18, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6404

What is the issue?

I'm using the latest Aider with [deepseek-coder-v2](https://ollama.com/library/deepseek-coder-v2).

I am hitting this error frequently; it recovers on its own after a while.

[server.log](https://github.com/user-attachments/files/16648359/server.log)

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.3.6

GiteaMirror added the bug label 2026-04-12 14:53:55 -05:00
@rick-github commented on GitHub (Aug 18, 2024):

```
C:\a\ollama\ollama\llm\llama.cpp\src\llama.cpp:15104: Deepseek2 does not support K-shift
```

As the error indicates, there are problems supporting this model. Possible fixes from a couple of PRs are tracked in https://github.com/ollama/ollama/issues/5975.

@rick-github commented on GitHub (Aug 18, 2024):

Workaround discussed in https://github.com/ggerganov/llama.cpp/issues/8862.
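(Editor's note: the workarounds linked above revolve around adjusting per-request model options so the context never needs to be shifted. A minimal sketch of how such options are passed to Ollama's `/api/generate` endpoint follows; the specific values for `num_ctx` and `num_predict` are illustrative assumptions, not the values prescribed in the linked threads.)

```python
import json

# Hedged sketch: Ollama accepts per-request model options in the
# "options" field of the /api/generate request body. A larger num_ctx
# plus a capped num_predict is the usual shape of a context-shift
# workaround; the numbers below are placeholders, not recommendations.
payload = {
    "model": "deepseek-coder-v2",
    "prompt": "write a hello world in Go",
    "options": {
        "num_ctx": 8192,      # context window size (illustrative value)
        "num_predict": 2048,  # cap generation length (illustrative value)
    },
}

request_body = json.dumps(payload)
```

The resulting JSON can be sent to a local Ollama server, e.g. `curl http://localhost:11434/api/generate -d "$request_body"`. The same parameters can also be set persistently with `PARAMETER` lines in a Modelfile.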

@RealIndica commented on GitHub (Sep 16, 2024):

I'm also having the same issue, but using Msty.

@rick-github commented on GitHub (Sep 16, 2024):

Have you tried the workaround in https://github.com/ollama/ollama/issues/6199#issuecomment-2295952982? [Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) would aid in debugging.

@RealIndica commented on GitHub (Sep 16, 2024):

> Have you tried the workaround in [#6199 (comment)](https://github.com/ollama/ollama/issues/6199#issuecomment-2295952982)? [Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) would aid in debugging.

I have tried setting the parameters to those values, but it still doesn't work, unfortunately.

@dhiltgen commented on GitHub (Oct 31, 2024):

Dup of #5975

Reference: github-starred/ollama#4022