[GH-ISSUE #5294] Support OPTIONS with OpenAI endpoints #29079

Open
opened 2026-04-22 07:43:28 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @windkwbs on GitHub (Jun 26, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5294

Originally assigned to: @ParthSareen on GitHub.

![WeChat screenshot_20240626152830](https://github.com/ollama/ollama/assets/129468439/23e316b8-cb87-4783-81af-96b94690a61a)
GiteaMirror added the feature request, api labels 2026-04-22 07:43:29 -05:00

@d-kleine commented on GitHub (Jun 26, 2024):

Have you tried with `http://localhost:11434`, as described here: https://ollama.com/blog/openai-compatibility
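To make the suggestion above concrete, here is a minimal sketch of a request against Ollama's OpenAI-compatible endpoint, without using the official `openai` client. The base URL matches the blog post; the model name and message are placeholder assumptions.

```python
import json

def build_chat_request(base_url="http://localhost:11434", model="llama3"):
    """Build an OpenAI-compatible chat completion request for Ollama.

    The model name here is an assumption for illustration; use whichever
    model you have pulled locally.
    """
    url = f"{base_url}/v1/chat/completions"  # OpenAI-compatible route
    headers = {"Content-Type": "application/json"}
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "Hello!"}],
    })
    return url, headers, body

url, headers, body = build_chat_request()
print(url)
```

To actually send it you would POST `body` to `url` with those headers (for example with `requests.post(url, headers=headers, data=body)`), which requires a running Ollama server.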


@windkwbs commented on GitHub (Jun 27, 2024):

It still doesn't work. Does this mean Windows systems can only use POST and not support OPTIONS?


@d-kleine commented on GitHub (Jun 27, 2024):

No, it should work on Windows too (HTTP methods are OS-agnostic), so it must be a different problem. Maybe your port is blocked by a firewall or antivirus. You could try debugging with Postman (it has an OPTIONS method).
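As an alternative to Postman, the debugging step above can be sketched with the standard library: send an OPTIONS (CORS-preflight-style) request to the endpoint and print how the server answers. The host, port, path, and `Origin` header below are assumptions for illustration.

```python
import http.client
from urllib.parse import urlparse

def build_preflight_headers(origin="http://localhost:3000"):
    """Headers a browser would attach to a CORS preflight OPTIONS request.

    The origin is a placeholder; substitute your web app's origin.
    """
    return {
        "Origin": origin,
        "Access-Control-Request-Method": "POST",
        "Access-Control-Request-Headers": "content-type",
    }

def send_options(url="http://localhost:11434/v1/chat/completions"):
    """Send an OPTIONS request and print the status and response headers."""
    parsed = urlparse(url)
    try:
        conn = http.client.HTTPConnection(parsed.hostname, parsed.port or 80,
                                          timeout=3)
        conn.request("OPTIONS", parsed.path, headers=build_preflight_headers())
        resp = conn.getresponse()
        print(resp.status, dict(resp.getheaders()))
    except OSError as exc:
        # No server listening, or the port is blocked (as suspected above)
        print("request failed:", exc)

if __name__ == "__main__":
    send_options()
```

If the server is reachable but the firewall theory is right, this fails with a connection error; if OPTIONS itself is unsupported, you would instead see a 4xx/5xx status with the response headers.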


@d-kleine commented on GitHub (Jun 29, 2024):

I just took a look at my console when doing inference with Ollama; it's only sending POST requests to `/api/chat`.


Reference: github-starred/ollama#29079