mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-08 04:16:03 -05:00
[GH-ISSUE #13123] issue: Streaming output of RAGFlow only works in frontend "TypeScript", but doesn't work in backend "Routers" #32345
Originally created by @AlexRice13 on GitHub (Apr 22, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/13123
Check Existing Issues
Installation Method
Docker
Open WebUI Version
v0.6.5
Ollama Version (if applicable)
No response
Operating System
Windows 11
Browser (if applicable)
Chrome 134.0.6998.166
Confirmation
Expected Behavior
RAGFlow is a RAG framework that exposes an OpenAI-compatible API for chat sessions, so I tried to access it through 'admin/setting/external_connections' in Open WebUI.
I expected RAGFlow to respond to Open WebUI's requests in streaming mode.
Actual Behavior
RAGFlow does not respond to Open WebUI in streaming mode; instead it returns the complete content in one piece, which causes huge response latency, far beyond what users will tolerate.
I have tried toggling "direct_connection" on and off in the admin settings and setting the streaming parameter to True, but neither helps.
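For context, a streaming chat request against an OpenAI-compatible API is signalled by `"stream": true` in the request body. The sketch below shows such a payload; the model name and message are made-up placeholders, not values from this issue:

```python
import json

# Hypothetical chat-completions payload; any OpenAI-compatible server
# (RAGFlow included) is expected to honor the "stream" flag.
payload = {
    "model": "ragflow-model",  # placeholder; RAGFlow resolves the model itself
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,  # request incremental SSE chunks instead of one full body
}

body = json.dumps(payload)
print(body)
```

If the server ignores this flag (or a proxy in between buffers the response), the client only sees one complete JSON body, which matches the behavior described above.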
However, when I set up the same external connection in the user settings, which is handled entirely by the frontend "src/lib/apis/openai/index.ts" (it reported needing correct CORS), streaming output just works.
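For reference, an OpenAI-compatible streaming endpoint emits Server-Sent Events, where each `data:` line carries a JSON chunk whose `delta` holds the next piece of content, terminated by a `data: [DONE]` sentinel. A minimal parser sketch (the event shapes follow the general OpenAI API format, not logs from this issue):

```python
import json

def parse_sse_stream(lines):
    """Yield content deltas from OpenAI-style SSE lines.

    Each event line looks like:
        data: {"choices":[{"delta":{"content":"Hi"}}]}
    and the stream ends with the sentinel line:
        data: [DONE]
    """
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank lines, comments, keep-alives
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Example: two deltas followed by the DONE sentinel
events = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    'data: [DONE]',
]
print("".join(parse_sse_stream(events)))  # -> Hello
```

The frontend path consumes events like these as they arrive, which is why it renders incrementally; a backend that collects the whole body first loses that behavior.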
Steps to Reproduce
1. Set up a RAGFlow OpenAI-compatible API.
2. Add an external connection under admin/setting/, using "http://ragflow_url/api/v1/chats_openai/c0161ad007d111f08ad10242ac120006" as the API endpoint URL and filling in the API key.
3. Add an arbitrary model name; RAGFlow resolves it automatically.
4. Enable direct connection under admin/setting/external_connections.
5. Set streaming to True in the model parameters.
6. Start a chat: the response does not stream.
7. Repeat the same setup under 'user/settings', which is handled entirely by the frontend "src/lib/apis/openai/index.ts".
8. There, streaming just works.
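One quick way to tell which behavior you are getting (a hypothetical diagnostic, not taken from this issue) is to inspect the response's Content-Type header: a streaming reply is served as `text/event-stream`, while a buffered one arrives as a single `application/json` body:

```python
def looks_like_stream(headers):
    """Return True if response headers indicate a Server-Sent Events stream."""
    content_type = headers.get("Content-Type", "").split(";")[0].strip().lower()
    return content_type == "text/event-stream"

# A streaming reply vs. a fully buffered JSON reply
print(looks_like_stream({"Content-Type": "text/event-stream; charset=utf-8"}))  # True
print(looks_like_stream({"Content-Type": "application/json"}))                  # False
```

Checking this in the browser's Network tab for both the admin and user paths would show whether RAGFlow itself buffers the reply or the backend router does.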
Logs & Screenshots
Expected behavior: (screenshot not mirrored)
Actual behavior: (screenshot not mirrored)
How the streaming one works: (screenshot not mirrored)
Additional Information
The frontend path works, but that configuration is only available to me as admin; I can't share it with other users.
@tth37 commented on GitHub (Apr 22, 2025):
Is there anything abnormal in browser console logs?