Confused about API support #4182

Closed
opened 2025-11-11 15:48:21 -06:00 by GiteaMirror · 1 comment

Originally created by @stefanoco on GitHub (Feb 27, 2025).

I'm confused about API support, partly due to the sparse documentation on this topic. WebUI is otherwise a *great* application that's fundamentally changing how we use LLMs in our organization.

Now we need an OpenAI-compatible endpoint with user-managed keys etc., and WebUI should be perfect for this. But while trying to connect a couple of Obsidian LLM plugins ("AI Providers" and "Local GPT") to our locally installed WebUI+Ollama stack, I can't get past model selection. I can see the list of models available in our WebUI instance, but then the rest of the API endpoints seem broken somehow (not really OpenAI-compatible).

The API endpoint I'm configuring is: https://mywebui.local/api
Any hint?


@lwsrbrts commented on GitHub (Feb 27, 2025):

I can certainly see where you're coming from, having struggled myself with which endpoint `base_url` to use when configuring things like PandasAI and LangChain.

Ultimately it's unknown how your Obsidian plugins will try to address the API endpoints, so it might be wise to check the logs coming out of OWUI. Also check whether the Obsidian plugins can accept an API key: if they assume they're hitting Ollama directly, they may not.

Here are a couple more base URLs to try:

https://owui.domain.com/ollama
https://owui.domain.com/v1
https://owui.domain.com/api/v1
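
If it helps narrow things down, here's a quick probe script (a rough sketch: `owui.domain.com` is a placeholder for your instance, and it assumes your key is in an `OPENWEBUI_API_KEY` environment variable) that checks which base answers a `GET <base>/models` request, which is typically the first call an OpenAI-compatible client makes:

```python
import os

import requests

api_key = os.getenv("OPENWEBUI_API_KEY")
headers = {"Authorization": f"Bearer {api_key}"}

# Candidate base URLs -- owui.domain.com stands in for your instance.
bases = [
    "https://owui.domain.com/api",
    "https://owui.domain.com/ollama",
    "https://owui.domain.com/v1",
    "https://owui.domain.com/api/v1",
]

for base in bases:
    # OpenAI-compatible bases should answer GET <base>/models with a model list.
    try:
        r = requests.get(f"{base}/models", headers=headers, timeout=10)
        print(f"{base}/models -> {r.status_code}")
    except requests.RequestException as e:
        print(f"{base}/models -> error: {e}")
```

Whichever base returns a 200 with a JSON model list is the one to give the plugin.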

No need to read on; the rest just explains what I saw while trying to get a LangChain script to work via OWUI.

For example, here's a basic LangChain script for talking to a CSV, which needs to use the `ChatOpenAI` class when using a non-local LLM. In this case the model `google/gemini-2.0-flash-001` is an OpenRouter.ai model I added to OWUI.

```python
import os
from langchain_experimental.agents import create_csv_agent
from langchain_openai import ChatOpenAI
from dotenv import load_dotenv
import langchain
import pandas as pd

#langchain.debug = True

# Load environment variables from a .env file
load_dotenv()

# Retrieve the API key from environment variables
openwebui_api_key = os.getenv("OPENWEBUI_API_KEY")

csv_path = "data.csv"

llm = ChatOpenAI(
    model_name="google/gemini-2.0-flash-001",  # Or any other model available on this Open WebUI instance.
    temperature=0.5,
    openai_api_key=openwebui_api_key,  # Replace with your Open WebUI API key.
    openai_api_base="https://owui.domain.com/api", # The base URL of your Open WebUI instance.
)

agent = create_csv_agent(
    llm=llm,
    path=csv_path,
    verbose=False, # Make verbose=True to see the generated code
    allow_dangerous_code=True # Clearly this is a warning that the code generated by the LLM is being run on this machine.
)

question = "Show me the total of column1 and column2"

response = agent.invoke(question)

print(response['output'])
```
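
The same `/api` base also works with the plain `openai` client, which is a quick way to confirm the OpenAI-compatible surface independently of LangChain. A sketch, assuming the OpenAI Python SDK (v1+) and the same placeholder URL and key as above:

```python
import os

from openai import OpenAI

# owui.domain.com is a placeholder; point this at your own instance.
client = OpenAI(
    base_url="https://owui.domain.com/api",
    api_key=os.getenv("OPENWEBUI_API_KEY"),
)

response = client.chat.completions.create(
    model="google/gemini-2.0-flash-001",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)

print(response.choices[0].message.content)
```

If that call succeeds, the OWUI side is fine and the problem is likely in how the plugin builds its requests.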

And this one attempts to use the `ChatOllama` class to do the same thing, but it won't work because the class tags `:latest` onto the model name. It does technically work with fully local Ollama models that have a `:` in their name, though, hence my suggestions above.

```python
import os

import pandas as pd
from dotenv import load_dotenv
from langchain_experimental.agents import create_csv_agent
from langchain_ollama.chat_models import ChatOllama

import langchain
# langchain.debug = True

# Load environment variables from a .env file
load_dotenv()

# Retrieve the API key from environment variables
openwebui_api_key = os.getenv("OPENWEBUI_API_KEY")

csv_path = "data.csv"

llm = ChatOllama(
    #model="llama3.1:8b", # Works
    #model="qwen2.5-coder:1.5b-base", # Works
    model="google/gemini-2.0-flash-lite-001", # This won't work because ":latest" gets tagged on by this class. Must use ChatOpenAI
    temperature=0.5,
    base_url="http://owui.domain.com/ollama",
    client_kwargs={
        "headers": {
            "Authorization": f"Bearer {openwebui_api_key}"
        }
    }
    # other params...
)

agent = create_csv_agent(
    llm=llm,
    path=csv_path,
    verbose=False, # Make verbose=True to see the generated code
    allow_dangerous_code=True # Clearly this is a warning that the code generated by the LLM is being run on this machine.
)


question = "Show me the total of column1 and column2"

response = agent.invoke(question)

print(response['output'])
```
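
If you're stuck with a model name that `ChatOllama` mangles, one workaround is to skip the class and call the proxied native Ollama chat endpoint yourself; this is the same `/ollama` base the script above relies on, just hit directly. A sketch, assuming the placeholder URL and that OWUI forwards the request once it accepts your bearer key:

```python
import os

import requests

# owui.domain.com is a placeholder; /ollama proxies the native Ollama API,
# so this is the same endpoint ChatOllama hits under the hood.
url = "https://owui.domain.com/ollama/api/chat"
headers = {"Authorization": f"Bearer {os.getenv('OPENWEBUI_API_KEY')}"}

payload = {
    "model": "llama3.1:8b",  # a model name that already carries a ':' tag
    "messages": [{"role": "user", "content": "Say hello in one word."}],
    "stream": False,
}

r = requests.post(url, headers=headers, json=payload, timeout=60)
r.raise_for_status()
print(r.json()["message"]["content"])
```

Since you control the `model` field here, nothing gets `:latest` appended behind your back.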
