[GH-ISSUE #471] OpenAI API endpoint location #50742

Closed
opened 2026-05-05 11:03:28 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @matbeedotcom on GitHub (Jan 14, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/471

Trying to use this ollama-webui as an OpenAI endpoint, but it doesn't seem to work. I'm trying to use continue.dev with Ollama.


@justinh-rahb commented on GitHub (Jan 14, 2024):

Hi @matbee-eth, I understand that you're looking for a way to use Ollama's functionality with an OpenAI-compatible API endpoint. However, the Ollama WebUI project is separate from Ollama, and neither offers this capability. Instead, I would recommend checking out alternative projects like [LiteLLM](https://github.com/BerriAI/litellm)+Ollama or [LocalAI](https://github.com/mudler/LocalAI) for accessing local models via an OpenAI-compatible API.

Additionally, based on the continue.dev documentation, it seems that it can directly work with Ollama's API without requiring an OpenAI-compatible endpoint. So, you may want to explore this option as well.
https://continue.dev/docs/reference/Model%20Providers/ollama
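
For context, "OpenAI-compatible" here means accepting the standard `/v1/chat/completions` request shape. A minimal sketch of that payload, as a proxy like LiteLLM fronting Ollama would accept it (the model identifier below is a placeholder, not anything this project ships):

```python
import json

# Minimal OpenAI-style chat completion request body. A proxy such as
# LiteLLM translates this into a call to Ollama's native API.
payload = {
    "model": "ollama/llama2",  # hypothetical model identifier
    "messages": [
        {"role": "user", "content": "Hello, world"},
    ],
    "stream": False,
}

body = json.dumps(payload)
print(body)
```

Any client that can emit this request body (continue.dev included) can then target the proxy instead of OpenAI's servers.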


@matbeedotcom commented on GitHub (Jan 14, 2024):

Yeah, I use it locally as well. Is there an API key system or a way to disable auth for localhost?


@justinh-rahb commented on GitHub (Jan 14, 2024):

To be clear, Ollama WebUI does not provide an API intended for use by external applications, and the author has stated previously that an option for disabling authentication isn't a goal of this particular project. If you're looking for a lighter-weight version of the application for personal local usage, you can check out [Ollama WebUI Lite](https://github.com/ollama-webui/ollama-webui-lite).

If you're looking for a way to provide an OpenAI-compatible API and manage API keys for Ollama, LiteLLM would be ideal. I recommend reading their documentation for a thorough understanding of its capabilities.
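
As a sketch of what key-managed access looks like from the client side, a request to a hypothetical local proxy carries the key in the standard `Authorization` header (the URL, port, and key below are placeholders; the proxy, not Ollama WebUI, is what would validate them):

```python
import urllib.request

# Hypothetical local OpenAI-compatible proxy address and API key.
# A proxy like LiteLLM can be configured to require such a key.
req = urllib.request.Request(
    "http://localhost:4000/v1/chat/completions",
    headers={"Authorization": "Bearer sk-local-key"},
    method="POST",
)
print(req.get_header("Authorization"))
```

Any OpenAI-style client library builds the same request internally once it is pointed at the proxy's base URL and given the key.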


Reference: github-starred/open-webui#50742