BUG: issue with OpenWebUI Ollama proxy API endpoint for Embeddings #2039

Closed
opened 2025-11-11 14:59:11 -06:00 by GiteaMirror · 2 comments

Originally created by @AndiMajore on GitHub (Sep 9, 2024).

Bug Report

Installation Method

Docker

Environment

  • Open WebUI Version: v0.3.21 (compared to v0.3.19)

  • Ollama: v0.3.10

  • Operating System: Ubuntu (but irrelevant)

  • Browser (if applicable): Chrome (irrelevant; the issue is with the API, not the UI)

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs. (Not applicable)
  • I have included the Docker container logs.
  • I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

LangChain's OllamaEmbeddings class should work the same against http://$OPENWEBUI/ollama and http://$OLLAMA

Actual Behavior:

LangChain's OllamaEmbeddings class works against http://$OLLAMA but crashes against http://$OPENWEBUI/ollama

Description

Bug Summary:
I am using langchain to work with LLMs.

```bash
langchain==0.2.16
langchain-community==0.2.16
langchain-core==0.2.38
langchain-ollama==0.1.3
langchain-text-splitters==0.2.4
```

Until now I could take the JWT of my Open WebUI user, put the authorization info in the request header, and use the OllamaEmbeddings and Ollama classes to query the LLM. Up to Open WebUI v0.3.19, using OllamaEmbeddings with the Open WebUI endpoint that proxies requests to Ollama worked. As of v0.3.21 it returns a response that cannot be parsed, because the response has an unexpected structure.
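
For reference, LangChain's OllamaEmbeddings parses the reply as Ollama's `/api/embeddings` schema: a JSON object with an `embedding` key holding a list of floats. A minimal sketch of that expectation (the `parse_ollama_embedding` helper below is illustrative, not LangChain's actual code):

```python
def parse_ollama_embedding(payload: dict) -> list[float]:
    """Extract the embedding vector the way the LangChain client expects.

    Ollama's POST /api/embeddings returns {"embedding": [0.1, ...]}; any
    other structure (e.g. a wrapped or renamed field, as the proxy now
    seems to produce) makes the client fail to parse the response.
    """
    if "embedding" not in payload:
        raise ValueError(f"unexpected response structure, keys: {sorted(payload)}")
    return payload["embedding"]

# A direct Ollama-style response parses fine:
print(parse_ollama_embedding({"embedding": [0.1, 0.2, 0.3]}))  # → [0.1, 0.2, 0.3]
```

Any response that deviates from this single-key shape would trip the `ValueError` path, which matches the parsing crash described above.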

Reproduction Details

Steps to Reproduce:

  • Spin up an Ollama and an Open WebUI Docker instance of the versions listed above
  • Run the following Python code once for each endpoint:
```python
from langchain_community.embeddings.ollama import OllamaEmbeddings

# Your Open WebUI user JWT
jwt = "XXX"
model = "llama3.1:latest"
# The Open WebUI proxy expects the JWT in the Authorization header
headers = {"Authorization": f"Bearer {jwt}"}

print("###Ollama###")
ollama_api_url = "http://$OLLAMA/"
ollama_embedder = OllamaEmbeddings(base_url=ollama_api_url, model=model)
embeddings = ollama_embedder.embed_documents(["Alzheimers", "Diabetes Type 2"])
for e in embeddings:
    print(e)

print("###OpenWebUI###")
openwebui_api_url = "http://$OPENWEBUI/ollama"
openwebui_embedder = OllamaEmbeddings(base_url=openwebui_api_url, model=model, headers=headers)
embeddings = openwebui_embedder.embed_documents(["Alzheimers", "Diabetes Type 2"])
for e in embeddings:
    print(e)
```

Logs and Screenshots

![image](https://github.com/user-attachments/assets/2b5efdbd-7261-4052-8f51-961e53f1cbca)

Docker Container Logs:

INFO:     134.100.17.201:0 - "POST /api/v1/auths/signin HTTP/1.1" 200 OK
INFO  [open_webui.apps.ollama.main] url: https://llm.cosy.bio
INFO:     134.100.17.201:0 - "POST /ollama/api/embeddings HTTP/1.1" 200 OK

@tjbck commented on GitHub (Sep 9, 2024):

Fixed on dev.


@AndiMajore commented on GitHub (Sep 10, 2024):

Thank you!

Reference: github-starred/open-webui#2039