[GH-ISSUE #7726] Streaming content from OpenAI-compatible API appears to be broken
Originally created by @Gordonei on GitHub (Dec 9, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/7726
Bug Report
Firstly, thanks for this awesome project! I suspect this may be related to #7183, but I've tried to put a bit more effort into the bug report.
Installation Method
Helm chart deploying open-webui to a k8s cluster, using Traefik ingress.
Environment
Open WebUI Version: v0.4.8
Operating System: Ubuntu 20.04

Confirmation: I am using the latest version of both Open WebUI and Ollama (not using Ollama).

Expected Behavior:
When chatting via the WebUI, content should be streamed out.
Actual Behavior:
Chat isn't displayed, and an error message appears (see the browser console logs below).
Description
Bug Summary:
It appears something is wrong in the handling of streamed content from OpenAI-compatible endpoints.
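For context, an OpenAI-compatible endpoint called with `"stream": true` returns server-sent events whose `choices[0].delta.content` fragments the client concatenates into the full reply. The sketch below is not from the original report; the endpoint URL and model name are placeholders, and it only illustrates what a client consuming such a stream would do.

```python
# Minimal sketch (not from the original report): consuming a streamed chat
# completion from an OpenAI-compatible endpoint with `requests`.
# The endpoint URL and model name are placeholders.
import json

import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",  # placeholder endpoint
    json={
        "model": "llama3.2",  # placeholder model name
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": True,
    },
    stream=True,
)

answer = ""
for line in resp.iter_lines(decode_unicode=True):
    # A streamed reply is a series of server-sent events: `data: {...}` lines
    # ending with a final `data: [DONE]` sentinel.
    if not line or not line.startswith("data: "):
        continue
    payload = line[len("data: "):]
    if payload.strip() == "[DONE]":
        break
    chunk = json.loads(payload)
    delta = chunk["choices"][0].get("delta", {})
    answer += delta.get("content") or ""

print(answer)
```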
Reproduction Details
Steps to Reproduce:
WebUI

API

Contents of `llama3.2-simple-test.json`:

Relevant bit of response (i.e. an empty response):
vs
Same, but with the `stream` parameter set to `false`. Response:
vs
Calling `llama-cpp-python` directly, with `"stream": true`. A rough Python equivalent of these calls is sketched below.
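The original curl commands and their outputs are not reproduced above. As a rough, hypothetical Python equivalent of the comparison (base URL, API key, and model name are placeholders; it assumes the llama-cpp-python server exposes the standard `/v1` routes):

```python
# Hypothetical stand-in for the curl calls above: the same request sent with
# and without streaming via the `openai` client (v1+ API).
# Base URL, API key, and model name are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

# Non-streamed: the full reply comes back in choices[0].message.content.
full = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "Hello"}],
    stream=False,
)
print(full.choices[0].message.content)

# Streamed: the reply arrives as incremental choices[0].delta.content chunks.
stream = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="", flush=True)
print()
```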
Logs and Screenshots

Browser Console Logs:
Error that appears in console:
Docker Container Logs:
Server logs appear to indicate that the call to the API was successful:
Screenshots/Screen Recordings (if applicable):
Additional Information
The model is being served by `llama-cpp-python`, v0.3.2; I can provide more details on this config if needed.
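For reference, a minimal sketch of exercising streaming through the llama-cpp-python library itself (model path is a placeholder), bypassing the HTTP server entirely:

```python
# Minimal sketch (placeholder model path): exercising streaming directly
# through the llama-cpp-python library, bypassing the HTTP server.
from llama_cpp import Llama

llm = Llama(model_path="/models/llama-3.2.gguf")  # placeholder path

for chunk in llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
):
    # Each chunk follows the OpenAI chat-completion chunk schema.
    delta = chunk["choices"][0].get("delta", {})
    print(delta.get("content") or "", end="", flush=True)
print()
```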
Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!