[GH-ISSUE #1467] REST API : /api/chat endpoint not working #47302

Closed
opened 2026-04-28 03:32:34 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @slovanos on GitHub (Dec 11, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1467

Referring to the examples on the main page:

## Generate a response: Works perfectly

```
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'
```

## Chat with a model: Not Working

### Response is "404 page not found"

```
curl http://localhost:11434/api/chat -d '{
  "model": "llama2",
  "messages": [
    { "role": "user", "content": "why is the sky blue?" }
  ]
}'
```

I tried restarting the server and also tried the mistral model, and I am getting the same result.

Author
Owner

@jmorganca commented on GitHub (Dec 11, 2023):

Hi @slovanos! Which version of Ollama are you on? (You can check with `ollama -v`.)

The chat API is available in 0.1.14 or later (just released yesterday :-). To upgrade, simply re-download Ollama from https://ollama.ai/ on Linux or macOS. Hope this helps!
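A quick local check along these lines can confirm whether the installed build is new enough for `/api/chat`. This is a sketch, assuming `ollama` is on the PATH and that `ollama -v` prints a semver string (falling back to `0.0.0` if it doesn't):

```shell
#!/usr/bin/env bash
# The /api/chat route exists only in Ollama 0.1.14 and later.
required="0.1.14"
installed="$(ollama -v 2>/dev/null | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' || echo 0.0.0)"

# sort -V does a version-aware comparison: if the required version sorts
# first (or equal), the installed version is new enough.
if [ "$(printf '%s\n' "$required" "$installed" | sort -V | head -n1)" = "$required" ]; then
  echo "ollama $installed: chat API supported"
else
  echo "ollama $installed: upgrade needed (chat API requires $required+)"
fi
```

The `sort -V` trick avoids pitfalls of plain string comparison (e.g. `0.1.9` vs `0.1.14`).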

Author
Owner

@slovanos commented on GitHub (Dec 11, 2023):

Thank you very much for the fast response. I'm on ollama version 0.1.13, so that may be the issue. Thanks!

Author
Owner

@ghassett commented on GitHub (Feb 18, 2024):

I see this issue consistently when using the "mistral" model, but everything works when using "llama2". My installed version is 0.1.17, running on macOS (Mac M1). With Ollama serving, I also get a 404 at http://localhost:11434/api/chat when I open that URL in a browser.

Author
Owner

@boldranet commented on GitHub (Mar 30, 2024):

@ghassett read the error message in the response body, not just the status line: a 404 from /api/chat can also mean the requested model isn't installed. Run "ollama run mistral" (or "ollama pull mistral") first.

And GET requests, which are what your browser sends, are routed differently from the POST requests the API expects. See also /Ollama/server.log
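To surface the body alongside the status line, curl can capture both in one call. A sketch, assuming a local Ollama server; the quoted error text in the comment is illustrative, not the literal server output:

```shell
#!/usr/bin/env bash
# Capture the response body and the HTTP status code in one request:
# -w '\n%{http_code}' appends the status on its own final line.
resp="$(curl -s -w '\n%{http_code}' http://localhost:11434/api/chat -d '{
  "model": "mistral",
  "messages": [{ "role": "user", "content": "hello" }]
}')"

status="${resp##*$'\n'}"   # last line: the HTTP status code
body="${resp%$'\n'*}"      # everything before it: the JSON body

if [ "$status" = "404" ]; then
  # The body typically explains the 404, e.g. a missing model.
  echo "404 body: $body"
  ollama pull mistral      # install the model, then retry the request
fi
```

The same split makes it easy to distinguish "route doesn't exist" (old Ollama build) from "model not found" (model not pulled), since both return 404 but with different bodies.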


Reference: github-starred/ollama#47302