[GH-ISSUE #1768] The API - http://127.0.0.1:11434/api doesn't work. #26774

Closed
opened 2026-04-22 03:22:14 -05:00 by GiteaMirror · 9 comments
Owner

Originally created by @PriyaranjanMaratheDish on GitHub (Jan 3, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/1768

1) The API at http://127.0.0.1:11434/api doesn't work. Are there any additional steps needed for http://127.0.0.1:11434/api to work correctly?

It doesn't work on my Mac, and not on EC2 either.


@igorschlum commented on GitHub (Jan 3, 2024):

@PriyaranjanMaratheDish when you type http://127.0.0.1:11434, do you see "Ollama is running" in the browser?
When you click on http://127.0.0.1:11434/api, do you see "404 page not found" in your browser?
/api is not a page where you can interact with Ollama. After installing the Ollama app, double-click it to launch Ollama, then open a terminal to load an LLM and chat with it.
You can also use an app like Chatbox, which gives you a web interface if you intend to use Ollama from a web browser.
Let me know if it works for you.


@technovangelist commented on GitHub (Jan 3, 2024):

hi @PriyaranjanMaratheDish, thanks for submitting this. `/api` isn't a valid endpoint. You should see a response on `/` or from a POST to `/api/generate`. Is there any documentation anywhere you have seen that points to `/api`? We would like to make sure it's fixed. Thanks for being a great part of this community.


@PriyaranjanMaratheDish commented on GitHub (Jan 3, 2024):

hi @technovangelist - see this link: https://github.com/ollama-webui/ollama-webui/blob/main/TROUBLESHOOTING.md. The step they mention is:

```
docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```


@PriyaranjanMaratheDish commented on GitHub (Jan 3, 2024):

hi @technovangelist @igorschlum - curl to /api/generate fails too, see below:

```
curl -v http://127.0.0.1:11434/api/generate
*   Trying 127.0.0.1:11434...
* Connected to 127.0.0.1 (127.0.0.1) port 11434 (#0)
> GET /api/generate HTTP/1.1
> Host: 127.0.0.1:11434
> User-Agent: curl/7.81.0
> Accept: */*
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 404 Not Found
< Content-Type: text/plain
< Date: Wed, 03 Jan 2024 20:04:22 GMT
< Content-Length: 18
<
* Connection #0 to host 127.0.0.1 left intact
```

@technovangelist commented on GitHub (Jan 3, 2024):

hi @PriyaranjanMaratheDish that doesn't look like a complete call to that API; it's a GET instead of a POST. You can find the docs for the Ollama API here: https://github.com/jmorganca/ollama/blob/main/docs/api.md. The example for `/api/generate` is:

```
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'
```
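The same POST can be made from code. Below is a minimal Python sketch using only the standard library; the request body (`model`, `prompt`) and the streamed NDJSON reply shape (`response` fragments, final `"done": true` object) follow the API docs linked above, while the helper names and the sample stream are illustrative, not captured from a real server:

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint

def build_generate_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for a POST to /api/generate."""
    return json.dumps({"model": model, "prompt": prompt}).encode("utf-8")

def collect_stream(ndjson_body: str) -> str:
    """Join the 'response' fragments from a streamed NDJSON reply.

    By default /api/generate streams one JSON object per line,
    ending with an object whose "done" field is true.
    """
    parts = []
    for line in ndjson_body.splitlines():
        if not line.strip():
            continue
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# To actually send the request (needs a running Ollama server):
#   import urllib.request
#   req = urllib.request.Request(
#       OLLAMA_URL,
#       data=build_generate_payload("llama2", "Why is the sky blue?"),
#       method="POST")
#   with urllib.request.urlopen(req) as resp:
#       print(collect_stream(resp.read().decode("utf-8")))

# Illustrative stream in the documented shape (not real output):
sample = "\n".join([
    '{"model":"llama2","response":"The sky ","done":false}',
    '{"model":"llama2","response":"is blue.","done":true}',
])
print(collect_stream(sample))  # prints: The sky is blue.
```

The key point from the comment above: the endpoint only answers a POST with a JSON body; a bare GET (as in the earlier `curl -v`) returns 404.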

@PriyaranjanMaratheDish commented on GitHub (Jan 4, 2024):

Thanks @technovangelist


@technovangelist commented on GitHub (Jan 4, 2024):

hi @PriyaranjanMaratheDish I think that means the issue was solved and that I can close this now. If you run into any more problems you can reopen this issue, create a new one, or check us out on Discord at https://discord.gg/ollama.


@zhu-peiqi commented on GitHub (Jul 17, 2024):

I tried it in CMD and it didn't run, but it did run in Git Bash.


@louiewh commented on GitHub (Feb 6, 2025):

[Image]

```
➜ AndroidStudioProjects curl http://127.0.0.1:11434
Ollama is running%
```

[Image]

How do I configure Chatbox?
Reference: github-starred/ollama#26774