Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-22 14:13:08 -05:00)
bug: OpenAI API connection doesn't accept mimetype: application/x-ndjson ? #2602
Originally created by @fabigr8 on GitHub (Nov 9, 2024).
Bug Report
Installation Method
Docker
Environment
Confirmation:
Expected Behavior:
Produce chat output in the web UI.
Actual Behavior:
Application produces no output.
Instead, the loading placeholder persists (the lines that represent a chat message and are normally replaced with actual text once the model responds).
Description
Bug Summary:
I am using LitServe to host models on a separate server, using LitServe's OpenAI spec with `streaming=True`. I ran into an issue that is very similar to #4915 (but for the OpenAI API).
Open WebUI seems to have an issue with the OpenAI connection when streaming is enabled for the model and it gets back `application/x-ndjson` from the served model (see the Docker logs below). Because of this, the chat cannot process the model response and shows no answer.
Additionally, I tested LitServe with a minimal Python client script using the `openai` package; it produces the correct output and processes the `application/x-ndjson` responses from LitServe correctly, so I can rule out an error on LitServe's side.
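To illustrate the likely format mismatch: OpenAI-style streaming normally uses SSE framing (`text/event-stream`), where each chunk is prefixed with `data: `, while NDJSON (`application/x-ndjson`) sends one bare JSON object per line. A client that only understands SSE framing will choke on bare NDJSON lines. A minimal stdlib sketch (the chunk payload below is a hypothetical example, not taken from the actual logs):

```python
import json

# The same streamed chunk in the two wire formats.
sse_line = 'data: {"choices": [{"delta": {"content": "Hello"}}]}'
ndjson_line = '{"choices": [{"delta": {"content": "Hello"}}]}'

def parse_sse(line: str) -> dict:
    """Parse an OpenAI-style SSE line: strip the 'data: ' prefix, then JSON-decode."""
    if not line.startswith("data: "):
        raise ValueError("not an SSE data line")
    return json.loads(line[len("data: "):])

def parse_ndjson(line: str) -> dict:
    """Parse an NDJSON line: each line is already a complete JSON object."""
    return json.loads(line)

print(parse_sse(sse_line)["choices"][0]["delta"]["content"])        # Hello
print(parse_ndjson(ndjson_line)["choices"][0]["delta"]["content"])  # Hello

# An SSE-only parser applied to an NDJSON line fails, which matches the
# symptom of the chat never rendering a response:
try:
    parse_sse(ndjson_line)
except ValueError as exc:
    print("SSE parser rejects NDJSON:", exc)
```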
You can find all scripts (LitServe example and client test) below.
Reproduction Details
Steps to Reproduce:
`pip install litserve` (and any other packages you may be missing), run the LitServe script below, then add `http://localhost:9000/v1` as a connection in Open WebUI (see Additional Information).
Logs and Screenshots
Browser Console Logs:
Docker Container Logs:
Screenshots/Screen Recordings (if applicable):
not applicable
Additional Information
Here is a minimal LitServe example without an LLM that sends back static text.
Running this script starts a LitServe server on port 9000.
If you run this on the same server as Open WebUI, you need to add
`http://localhost:9000/v1` with any API key (the value is not relevant) in the admin settings to connect to LitServe. Also make sure the Open WebUI Docker container runs with `--network=host`.
LitServe code:
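The original script was not preserved in this mirror, so the following is a hypothetical reconstruction of what a minimal streaming LitServe server with the OpenAI spec looks like (the class name, the static reply text, and the token-splitting scheme are my own placeholders):

```python
# Hypothetical reconstruction -- the original script from the issue was not
# preserved. Sketch of a LitServe server using the OpenAI-compatible spec.
import litserve as ls

class StaticTextAPI(ls.LitAPI):
    def setup(self, device):
        # No real LLM: we only stream back a fixed string.
        self.reply = "This is a static test response."

    def predict(self, prompt):
        # Yielding pieces one at a time enables streaming responses.
        for token in self.reply.split():
            yield token + " "

if __name__ == "__main__":
    api = StaticTextAPI()
    # OpenAISpec exposes an OpenAI-compatible /v1/chat/completions endpoint.
    server = ls.LitServer(api, spec=ls.OpenAISpec())
    server.run(port=9000)
```

This blocks and serves on port 9000, matching the `http://localhost:9000/v1` URL used in the reproduction steps.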
Additionally, here is the minimal code to test LitServe with the OpenAI client library:
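As with the server script, the original client code was not captured; below is a hedged sketch of such a test using the `openai` package, assuming the LitServe server above is running on localhost port 9000 (the model name is a placeholder ignored by the toy server):

```python
# Hypothetical reconstruction of the client-side test script; assumes the
# LitServe server sketch is running locally on port 9000.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:9000/v1", api_key="not-needed")

stream = client.chat.completions.create(
    model="litserve",  # placeholder; the toy server does not check the name
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
)
# Print streamed deltas as they arrive, mirroring what Open WebUI should do.
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```

If this script prints the streamed text while Open WebUI shows nothing against the same endpoint, that isolates the problem to how Open WebUI handles the `application/x-ndjson` response.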