Support Ollama's OpenAI Compatible APIs #392

Closed
opened 2025-11-11 14:20:02 -06:00 by GiteaMirror · 5 comments
Owner

Originally created by @wlh320 on GitHub (Mar 2, 2024).

Is your feature request related to a problem? Please describe.

According to https://ollama.com/blog/openai-compatibility and the code
21347e1ed6/server/routes.go (L961-L962)

Ollama's OpenAI-compatible APIs do not have an '/api' prefix.
However, in the current implementation of open-webui, `OLLAMA_API_BASE_URL` must have an '/api' suffix.
If users want to communicate with Ollama's API through open-webui,
they can't use Ollama's OpenAI-compatible APIs and use open-webui in the browser at the same time.

Describe the solution you'd like

Is there any way to support both Ollama's own APIs and Ollama's OpenAI-compatible APIs?
Maybe adding a new route `/ollama/api/v1/{path}` and manually handling `OLLAMA_API_BASE_URL` in the handler could solve this problem?
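As a sketch of the idea above (hypothetical helper, not actual Open WebUI code): since `OLLAMA_API_BASE_URL` conventionally ends in `/api` while Ollama's OpenAI-compatible endpoints live under `/v1` at the server root, a proxy route could derive the OpenAI-compatible base by stripping the `/api` suffix:

```python
# Hypothetical sketch: derive Ollama's OpenAI-compatible base URL from the
# native OLLAMA_API_BASE_URL, which conventionally ends in "/api".
# The function name and the /ollama/api/v1/{path} route are illustrative.

def openai_base_url(ollama_api_base_url: str) -> str:
    """Strip the trailing '/api' so OpenAI-style '/v1/...' paths resolve."""
    base = ollama_api_base_url.rstrip("/")
    if base.endswith("/api"):
        base = base[: -len("/api")]
    return base + "/v1"

# A handler for /ollama/api/v1/{path} could then forward requests to
# openai_base_url(OLLAMA_API_BASE_URL) + "/" + path.
```

For example, `openai_base_url("http://localhost:11434/api")` yields `http://localhost:11434/v1`, which is where Ollama serves `/v1/chat/completions`.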


@justinh-rahb commented on GitHub (Mar 2, 2024):

Why? The OpenAI compatible endpoint does not return the same information the native one does, so it's a degraded experience if used in Open WebUI. Besides, you can already add it as an "OpenAI" connection or LiteLLM "model". This request is redundant.


@wlh320 commented on GitHub (Mar 2, 2024):

Open WebUI provides security benefits for Ollama's APIs, so some users would like to use other clients, such as a desktop client compatible with OpenAI's API, and point them at open-webui's URL to reach Ollama, while other users would like to use Open WebUI directly in the browser. Under the current design, these two requirements cannot be met at the same time.


@wlh320 commented on GitHub (Mar 2, 2024):

Maybe the first way is the wrong way to use Open WebUI, because it only uses Open WebUI's exposed APIs.


@justinh-rahb commented on GitHub (Mar 2, 2024):

> Maybe the first way is a wrong way to use Open WebUI because it only uses its exposed APIs.

It certainly doesn't sound like an *intended* use-case. If you want authentication for the Ollama API there are other ways to do it: you could front it with a reverse proxy that adds an authentication mechanism. I don't think there's anything that can be done to service this need from WebUI's point of view, though.
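The reverse-proxy suggestion could be sketched with the Python standard library alone (the token, port, and upstream address below are placeholders, not anything from Open WebUI or Ollama's configuration):

```python
# Minimal sketch of an authenticating reverse proxy in front of Ollama.
# Values below are illustrative placeholders; a production setup would
# more likely use nginx/Caddy with an auth layer.
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "http://localhost:11434"   # Ollama's default listen address
API_TOKEN = "change-me"               # placeholder shared secret

def is_authorized(auth_header):
    """Accept only requests carrying the expected bearer token."""
    return auth_header == f"Bearer {API_TOKEN}"

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Reject requests without the shared secret before touching Ollama.
        if not is_authorized(self.headers.get("Authorization")):
            self.send_error(401, "Unauthorized")
            return
        # Forward the request path to the upstream and relay the response.
        with urllib.request.urlopen(UPSTREAM + self.path) as resp:
            body = resp.read()
        self.send_response(resp.status)
        self.send_header("Content-Type",
                         resp.headers.get("Content-Type", "text/plain"))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8081), ProxyHandler).serve_forever()
```

An OpenAI-compatible client would then be pointed at `http://127.0.0.1:8081/v1` with the token as its API key, leaving Open WebUI free to talk to Ollama's native `/api` routes directly.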


@wlh320 commented on GitHub (Mar 2, 2024):

Thank you! I thought it was intended. Sorry for misunderstanding this.


Reference: github-starred/open-webui#392