Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 11:28:35 -05:00)
[GH-ISSUE #10067] Bug - OpenAI exposed API not compatible with external tools #54418
Originally created by @Seniorsimo on GitHub (Feb 15, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/10067
Bug Report
Installation Method
Docker
Environment
Confirmation:
Expected Behavior:
When the OpenAI-compatible APIs are used with any external tool, they should behave like the official ones.
In my case I'm using the Open WebUI APIs from a langchain agent. Switching from the official OpenAI API to Open WebUI should work without any compatibility issues.
Actual Behavior:
The API exposed by Open WebUI causes a series of consecutive problems and doesn't work as expected.
Description
Bug Summary:
After switching my project from OpenAI to the Open WebUI API, the problems listed below arose one after another. I managed to solve them in a local fork, and now everything seems to work in my case, so I'm opening this bug to track all the little changes needed to make it work.
Sorry if this is too verbose; I'll try to explain everything as best I can.
The problems found are, in order:
Usage stats don't follow the OpenAI standard
The usage stats exposed by the `/api/chat/completions` endpoint do not follow the OpenAI standard. This breaks any external tool that relies on the `usage` field of the exposed API: in my case, langchain broke because no `prompt_tokens`, `completion_tokens`, or `total_tokens` keys are present in the `usage` dictionary.

Suggested fix:

Change the returned `usage` object to match the OpenAI standard.
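A minimal sketch of the kind of normalization the fix needs (the Ollama-style source field names `prompt_eval_count`/`eval_count` are assumptions based on Ollama's response format, and the token counts are illustrative only, not Open WebUI's actual code):

```python
def to_openai_usage(raw: dict) -> dict:
    """Map an Ollama-style usage payload to the OpenAI `usage` shape
    that clients such as langchain expect."""
    prompt = raw.get("prompt_eval_count", 0)
    completion = raw.get("eval_count", 0)
    return {
        "prompt_tokens": prompt,
        "completion_tokens": completion,
        "total_tokens": prompt + completion,
    }

# Illustrative counts only.
usage = to_openai_usage({"prompt_eval_count": 12, "eval_count": 34})
print(usage)  # {'prompt_tokens': 12, 'completion_tokens': 34, 'total_tokens': 46}
```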
d317085dd8

ChatMessage validation is incorrect
The ChatMessage validation for the input messages in the prompt is not correct. Take this example call's messages:
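The original example was not captured here; as a hedged reconstruction, a typical OpenAI-style message sequence around a tool call looks like this (the model output and tool name are made up):

```python
# A typical OpenAI-style conversation after a tool call.  The third
# (assistant) message legitimately has `tool_calls` but no `content`.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What's the weather in Rome?"},
    {
        "role": "assistant",
        "tool_calls": [
            {
                "id": "call_1",
                "type": "function",
                "function": {"name": "get_weather", "arguments": '{"city": "Rome"}'},
            }
        ],
    },
    {"role": "tool", "tool_call_id": "call_1", "content": "18°C, sunny"},
]
```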
This is actually a valid OpenAI call to make after a tool has been used. Open WebUI rejects it because the third message has no `content` field.

Suggested Fix
Change the validator so that at least one of `content` or `tool_calls` must be defined.

3f8c446f2f

Missing support for tools in the OpenAI-to-Ollama message conversion
During a request like the one above, no tools were passed on to Ollama, and in the last two converted messages the `tool_calls` and `tool_call_id` fields are missing.

Suggested Fix
Add support for this kind of message.
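A minimal sketch of what that support could look like (this is not Open WebUI's actual conversion code; the Ollama-side field layout is an assumption based on its chat API, which takes `arguments` as an object rather than a JSON string):

```python
import json

def convert_openai_to_ollama(msg: dict) -> dict:
    """Carry tool-related fields through the OpenAI -> Ollama conversion
    instead of dropping them (a sketch, not Open WebUI's actual code)."""
    converted = {"role": msg["role"], "content": msg.get("content") or ""}
    if "tool_calls" in msg:
        # Ollama expects `arguments` decoded into an object.
        converted["tool_calls"] = [
            {
                "function": {
                    "name": call["function"]["name"],
                    "arguments": json.loads(call["function"]["arguments"]),
                }
            }
            for call in msg["tool_calls"]
        ]
    if "tool_call_id" in msg:
        converted["tool_call_id"] = msg["tool_call_id"]
    return converted
```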
0758abc495

Ollama can generate a single message in a call with stream=true
Lastly, when making the call above with stream=true, Ollama responds with a single message, and the current Ollama-to-OpenAI conversion produces a message with null `content`.

Suggested Fix

Handle this corner case.
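One way to sketch that corner-case handling (the function name is hypothetical, not Open WebUI's actual code): never let a converted message report null `content`.

```python
def finalize_openai_message(message: dict) -> dict:
    """Hypothetical guard: when Ollama returned everything in a single
    chunk and conversion left `content` as None, fall back to an empty
    string so OpenAI clients don't choke on a null field."""
    if message.get("content") is None:
        message["content"] = ""
    return message

print(finalize_openai_message({"role": "assistant", "content": None}))
```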
c4617bea2e

Reproduction Details
Steps to Reproduce:
This whole chain of errors can easily be reproduced by creating a new Python file with langchain and using the standard OpenAI adapter to connect to the Open WebUI APIs. Building a simple agent surfaces all of the problems above.
Code for reference:
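The original snippet was not captured here; as a stand-in, this sketch builds the kind of tool-calling request that a langchain agent would send through the OpenAI adapter to Open WebUI (the base URL, API key, model, and tool are all placeholders):

```python
import json
import urllib.request

BASE_URL = "http://localhost:3000"  # placeholder: your Open WebUI instance
API_KEY = "sk-..."                  # placeholder: an Open WebUI API key

# The OpenAI-style tool-calling payload an agent framework would produce.
payload = {
    "model": "llama3.1",  # placeholder: any tool-capable model
    "stream": True,
    "messages": [{"role": "user", "content": "What's the weather in Rome?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

request = urllib.request.Request(
    f"{BASE_URL}/api/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
)
# urllib.request.urlopen(request)  # uncomment against a running instance
```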
Logs and Screenshots
Browser Console Logs:
not applicable
Docker Container Logs:
INFO: 10.88.5.1:0 - "POST /api/chat/completions HTTP/1.1" 200 OK
There are no useful logs here; every problem results in a 200 response and a subsequent error in the client due to the various missing parts.
Screenshots/Screen Recordings (if applicable):
not applicable
Additional Information
I found all of this while trying to integrate a Langchain agent with Open WebUI, but I suppose other software could face the same problems when trying to use Open WebUI as an OpenAI-compatible API.
Exposing an OpenAI-compatible API is a great choice in my view! Many other people may use this fantastic free tool as a backend for their projects, so from my point of view, making the exposed API 100% compatible with the official one is critical.
Let me know if anything I reported seems wrong to you.
I'm happy to help fix all of this, not only for my project but possibly for others too.