[GH-ISSUE #9064] Error: an error was encountered while running the model: unexpected EOF #67958

Closed
opened 2026-05-04 12:07:17 -05:00 by GiteaMirror · 4 comments

Originally created by @feihongloveworld on GitHub (Feb 13, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9064

What is the issue?

"When running the command ollama run deepseek-r1:671b and asking a question, the response was interrupted by the error shown in the title."

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-05-04 12:07:17 -05:00

@rick-github commented on GitHub (Feb 13, 2025):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) would aid in debugging, but probably https://github.com/ollama/ollama/issues/5975.

@g2dgaming commented on GitHub (Feb 22, 2025):

Same issue

@rick-github commented on GitHub (Feb 22, 2025):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) would aid in debugging, but probably https://github.com/ollama/ollama/issues/5975.
Author
Owner

@LilaKen commented on GitHub (Feb 26, 2025):

Setting some parameters, such as `num_predict` and `num_ctx`, can solve this problem, though it requires more CUDA memory. The following is my setting; hope this helps.

```json
"options": {
  "seed": 42,
  "num_predict": 8192,
  "repeat_penalty": 1.5,
  "num_ctx": 24576
}
```
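As a minimal sketch, options like these can be supplied per request to Ollama's `/api/generate` endpoint; the model name and prompt below are placeholders, and whether a larger `num_ctx` actually fits depends on available GPU memory:

```python
import json

# Request payload for Ollama's /api/generate endpoint, mirroring the
# options suggested above. Model and prompt are illustrative placeholders.
payload = {
    "model": "deepseek-r1:671b",
    "prompt": "Why is the sky blue?",
    "options": {
        "seed": 42,
        "num_predict": 8192,   # cap on generated tokens
        "repeat_penalty": 1.5,
        "num_ctx": 24576,      # larger context window; needs more memory
    },
}

# Serialize and POST this body to http://localhost:11434/api/generate
# (e.g. with urllib.request or curl).
body = json.dumps(payload)
```

The same parameters can also be set persistently with `PARAMETER` lines in a Modelfile instead of per request.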

Reference: github-starred/ollama#67958