[GH-ISSUE #2913] Running with Ollama desktop vs ollama serve #1786

Closed
opened 2026-04-12 11:48:46 -05:00 by GiteaMirror · 2 comments

Originally created by @sofiafernandescd on GitHub (Mar 4, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2913

Hello, great work with this project!
I'm using Ollama models to feed my crewAI agents, but I noticed that:

  • when I turn on Ollama desktop, everything runs well
  • if I shut down Ollama desktop and use "ollama serve" to start Ollama instead, my program outputs an error because it seems to be expecting a JSON request

How do I solve this? I want to run Ollama on a GPU instance, and it's not easy to get Ollama desktop running there, so I was looking for a simple way to get the same behaviour with "ollama serve".
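
For context, this is roughly how I call the server from Python; a minimal sketch assuming the default port 11434, a model that is already pulled ("llama2" is just a placeholder here), and the "requests" library instead of crewAI itself:

```python
import requests

# Base URL of the Ollama server started with "ollama serve".
# On a remote GPU instance this would be that machine's address instead,
# and the server would need OLLAMA_HOST=0.0.0.0 to accept outside connections.
OLLAMA_URL = "http://localhost:11434"

resp = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={
        "model": "llama2",    # placeholder; use whatever model is pulled
        "prompt": "Say hello.",
        "stream": False,      # return one JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

If this works against "ollama serve" but my actual program still fails, the difference is probably on the client side rather than in how the server was started.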

Thank you so much!


@easp commented on GitHub (Mar 4, 2024):

What OS are you using?


@sofiafernandescd commented on GitHub (Mar 4, 2024):

@easp I'm on macOS, and I also tried serving Ollama on a Koyeb instance. I just tested again and it seems to be working now in both. I don't know what caused this, but since it's solved, I'm closing the issue. Thank you


Reference: github-starred/ollama#1786