mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 19:38:46 -05:00)
Confused about API support #4182
Originally created by @stefanoco on GitHub (Feb 27, 2025).
I'm confused about API support, partly because the documentation on this topic is sparse. Open WebUI is otherwise a great application that is fundamentally changing how our organization uses LLMs.
We now need an OpenAI-compatible endpoint with user-managed API keys, and Open WebUI should be perfect for this. However, while trying to connect a couple of Obsidian LLM plugins ("AI Providers" and "Local GPT") to our locally installed Open WebUI + Ollama stack, I can't get past model selection. I can see the list of models available in our Open WebUI instance, but after that the API endpoints appear to be broken somehow (not fully OpenAI-compatible).
The API endpoint I'm configuring is: https://mywebui.local/api
Any hint?
@lwsrbrts commented on GitHub (Feb 27, 2025):
I can certainly see where you're coming from, having struggled with which endpoint `base_url` to use when configuring things like PandasAI and LangChain... I think ultimately it's unknown how your Obsidian plugins are going to try to address the API endpoints, so it might be wise to check the logs coming out of OWUI. Also check whether the Obsidian plugins can accept an API key, because if they assume they're hitting Ollama directly, they may not.
Here are a couple more to try...
https://owui.domain.com/ollama
https://owui.domain.com/v1
https://owui.domain.com/api/v1
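As a sanity check independent of any plugin, you can hit Open WebUI's OpenAI-compatible chat endpoint directly. The sketch below (stdlib only) builds such a request; `owui.domain.com`, the API key, and the model name are placeholders you'd replace with your own values (keys are created in Open WebUI under Settings > Account), and the `/api/chat/completions` path is my understanding of where OWUI exposes chat completions:

```python
import json
import urllib.request

# Placeholder values -- substitute your own OWUI host, API key, and model id.
BASE_URL = "https://owui.domain.com"
API_KEY = "sk-xxxx"

def build_chat_request(base_url: str, api_key: str,
                       model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request against Open WebUI.

    Assumes OWUI serves its OpenAI-compatible chat endpoint at
    <base_url>/api/chat/completions with a Bearer API key.
    """
    url = base_url.rstrip("/") + "/api/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(BASE_URL, API_KEY, "llama3.2:latest", "Hello!")
print(req.full_url)  # https://owui.domain.com/api/chat/completions
```

If a `curl`-equivalent of this request works but the plugin still fails, the problem is in how the plugin constructs its paths, which should show up in the OWUI logs.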
No need to read on, this is just explaining what I saw when trying to get a LangChain script to work via OWUI...
For example, a basic LangChain script for talking to a CSV needs to use the `ChatOpenAI` class when using a non-local LLM. In this case the model `google/gemini-2.0-flash-001` is an OpenRouter.ai model I added to OWUI. Another attempt used the `ChatOllama` class to do the same thing, but that won't work due to the way the class tacks `:latest` onto the model name. It does technically work with fully local Ollama models that have a `:` in their name though, hence my suggestions above.
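The `:latest` mismatch described above can be sketched as follows. This is a hypothetical re-implementation of the tagging behaviour, not the actual `ChatOllama` source: a model name without an explicit tag gets `:latest` appended, so an OWUI-proxied model id that legitimately contains no tag no longer matches anything the server knows about.

```python
def ollama_style_model_name(model: str) -> str:
    """Mimic the tagging behaviour described above (a sketch, not the
    real ChatOllama code): names with no explicit tag are assumed to
    mean ':latest'."""
    return model if ":" in model else model + ":latest"

# A fully local Ollama model already carries a tag, so it passes through:
print(ollama_style_model_name("llama3.2:3b"))  # llama3.2:3b

# An OWUI-proxied OpenRouter model has no tag, so it gets mangled into an
# id the server doesn't serve:
print(ollama_style_model_name("google/gemini-2.0-flash-001"))
# google/gemini-2.0-flash-001:latest
```

This is why the endpoint suggestions above favour the OpenAI-style routes (`ChatOpenAI`) for remote models added through OWUI, and reserve the Ollama-style route for local models whose names already contain a `:`.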