[GH-ISSUE #2254] No response from ollama #1292

Closed
opened 2026-04-12 11:06:36 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @caibirdme on GitHub (Jan 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2254

No response from ollama

```
curl -X POST -d '{"model":"llama2", "messages":[{"role":"user","content":"why the weather in winter is so cold?"}], "stream":false}' 127.0.0.1:11434/api/chat
```

Here's the `ollama list`:

```
llama2:latest   78e26419b446    3.8 GB  4 hours ago
llava:latest    cd3274b81a85    4.5 GB  56 minutes ago
```

And when I use `top` to check CPU and memory usage, ollama doesn't seem to be doing anything; both CPU and memory usage are very low.
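The same request can be issued from Python with an explicit timeout, so a hung server fails fast instead of blocking forever. This is a minimal sketch mirroring the curl command above; the helper names are my own:

```python
import json
import urllib.request


def build_chat_payload(model, prompt, stream=False):
    """Build the JSON body for ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }


def chat(prompt, model="llama2", host="127.0.0.1:11434", timeout=30):
    """POST to /api/chat; raises on timeout instead of hanging."""
    body = json.dumps(build_chat_payload(model, prompt)).encode()
    req = urllib.request.Request(
        f"http://{host}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)

# usage: chat("why is the weather in winter so cold?")
```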

@easp commented on GitHub (Jan 29, 2024):

Are you using the latest version of ollama? Earlier versions could become unresponsive.

Does the ollama cli itself work?

@caibirdme commented on GitHub (Jan 30, 2024):

@easp I'm using the latest 0.1.22.

  1. `ollama run llama2` works.
  2. I wrote my own code to access ollama and load llava; it works, but after I send some images it responds with an internal error.
  3. After that, neither `ollama run` nor curl works.
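Since the failure started after sending images to llava, the request shape is worth double-checking: the Ollama chat API expects image data as base64 strings in the message's `images` list. A minimal sketch for building such a message (the helper name is my own):

```python
import base64


def build_image_message(prompt, image_bytes):
    """Build a /api/chat message carrying an image for a multimodal
    model such as llava; images are base64-encoded strings."""
    return {
        "role": "user",
        "content": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
    }

# usage: build_image_message("what is in this picture?", open("cat.png", "rb").read())
```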

@bellowswang commented on GitHub (Feb 13, 2024):

Having the same issue.


@jmorganca commented on GitHub (Feb 20, 2024):

This issue should be fixed as of 0.1.25 – but please let me know if it isn't (and if so, would it be possible to share the prompt / image formats you used?) Thanks so much!


@KUKARAF commented on GitHub (Apr 21, 2024):

Issue still present on Debian 12


@ploncker commented on GitHub (Aug 7, 2024):

Issue still present on an M1 Mac.


@augusto-rehfeldt commented on GitHub (Sep 22, 2024):

Issue still present on Windows 11 with any model of any size.

Reference: github-starred/ollama#1292