[GH-ISSUE #11408] No response from the models #33290

Closed
opened 2026-04-22 15:49:46 -05:00 by GiteaMirror · 1 comment

Originally created by @pulinagrawal on GitHub (Jul 14, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11408

What is the issue?

Hi,

I have Ollama serving. When I use either the python-sdk or the CLI, after running

    ollama run deepseek-r1:8b   (same issue with gemma3)

    >>> hello
    :.

it keeps loading forever.

This is a machine with no GPU and 64 GB of memory.
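For reference, the same request can be reproduced without the SDK against Ollama's REST API using only the Python standard library. This is a sketch: the host, model name, and timeout are assumptions matching the report above.

```python
# Minimal reproduction via Ollama's /api/generate endpoint, stdlib only.
# Host, model name, and timeout are assumptions.
import json
import urllib.request
import urllib.error

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "deepseek-r1:8b",
        "prompt": "hello",
        "stream": False,
    }).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

try:
    # A hang here (rather than an error) matches the reported symptom.
    with urllib.request.urlopen(req, timeout=120) as resp:
        result = json.load(resp)["response"]
except OSError as exc:  # covers URLError, timeouts, connection refused
    result = f"request failed: {exc}"

print(result)
```

If this request also stalls, that would point at the server rather than the clients.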

Relevant log output

(none provided)
OS

Linux

GPU

Other

CPU

Other

Ollama version

0.9.6

GiteaMirror added the bug label 2026-04-22 15:49:46 -05:00

@rick-github commented on GitHub (Jul 14, 2025):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) will aid in debugging.
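For a Linux machine like the one reported, the logs can be collected roughly as follows. This is a sketch assuming a typical systemd-based install, per Ollama's troubleshooting documentation; adjust for your setup.

```shell
# Recent server logs when Ollama runs as a systemd service:
journalctl -u ollama --no-pager -n 200 \
  || echo "ollama is not running under systemd on this machine"
```

If the server was started manually instead, its logs go to that terminal; restarting it with `OLLAMA_DEBUG=1 ollama serve` enables verbose logging.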


Reference: github-starred/ollama#33290