mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-06 10:58:17 -05:00
OpenAI API endpoint location #181
Originally created by @matbeedotcom on GitHub (Jan 14, 2024).
Trying to use this ollama-webui as an OpenAI endpoint, but it doesn't seem to work. I'm trying to use continue.dev with Ollama.
@justinh-rahb commented on GitHub (Jan 14, 2024):
Hi @matbee-eth, I understand that you're looking for a way to use Ollama's functionality with an OpenAI-compatible API endpoint. However, the Ollama WebUI project is separate from Ollama, and neither offers this capability. Instead, I would recommend checking out alternative projects like LiteLLM+Ollama or LocalAI for accessing local models via an OpenAI-compatible API.
Additionally, based on the continue.dev documentation, it seems that it can directly work with Ollama's API without requiring an OpenAI-compatible endpoint. So, you may want to explore this option as well.
https://continue.dev/docs/reference/Model%20Providers/ollama
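To illustrate the direct-Ollama route mentioned above, here is a minimal sketch of a continue.dev `config.json` entry using its native Ollama provider. The exact field names and the model tag (`llama2`) are assumptions based on the continue.dev configuration docs of that period and may differ by version; check the linked reference for your install.

```json
{
  "models": [
    {
      "title": "Llama 2 (local Ollama)",
      "provider": "ollama",
      "model": "llama2"
    }
  ]
}
```

With an entry like this, continue.dev talks to the local Ollama server directly, so no OpenAI-compatible shim is needed.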
@matbeedotcom commented on GitHub (Jan 14, 2024):
Yeah, I use it locally as well. Is there an API key system, or a way to disable auth for localhost?
@justinh-rahb commented on GitHub (Jan 14, 2024):
To be clear, Ollama WebUI does not provide an API intended for use by external applications, and the author has stated previously that an option for disabling authentication isn't a goal of this particular project. If you're looking for a lighter-weight version of the application for personal local usage, you can check out Ollama WebUI Lite.
If you're looking for a way to expose an OpenAI-compatible API and manage API keys for Ollama, LiteLLM would be ideal. I recommend reading their documentation for a thorough understanding of its capabilities.
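For the LiteLLM route suggested above, a proxy config along these lines would front a local Ollama model with an OpenAI-compatible endpoint. This is a sketch based on the LiteLLM proxy documentation; the field names, default Ollama port (11434), and proxy port are assumptions to verify against the current docs.

```yaml
# litellm config.yaml (sketch)
model_list:
  - model_name: llama2              # name clients request via the OpenAI-style API
    litellm_params:
      model: ollama/llama2          # routes requests to a local Ollama instance
      api_base: http://localhost:11434
```

You would then start the proxy (e.g. `litellm --config config.yaml`) and point any OpenAI client at the proxy's base URL; LiteLLM also supports issuing and checking API keys, which covers the auth question above.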