[GH-ISSUE #13123] issue: Streaming output of RAGFlow only works in frontend "TypeScript", but doesn't work in backend "Routers" #16816

Closed
opened 2026-04-19 22:38:36 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @AlexRice13 on GitHub (Apr 22, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/13123

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

v0.6.5

Ollama Version (if applicable)

No response

Operating System

Windows 11

Browser (if applicable)

Chrome 134.0.6998.166

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

RAGFlow is a RAG framework that exposes an OpenAI-compatible API for chat sessions, so I tried to access it through 'admin/setting/external_connections' in Open WebUI.
I expected RAGFlow to respond to Open WebUI's requests as a stream.

Actual Behavior

RAGFlow does not respond to Open WebUI as a stream; instead the entire completion arrives at once, causing huge response latency that is far beyond users' patience.

I have tried toggling the "direct_connection" switch in the admin settings on and off, and setting the streaming parameter to True, but neither works.

However, when I set up the same external connection in the user settings, which is handled entirely by the frontend ("src/lib/apis/openai/index.ts", which notes that it needs correct CORS), streaming output just works.
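To tell whether the backend is actually forwarding SSE chunks or buffering the whole reply, the stream can be inspected directly. Below is a minimal sketch (the URL, chat id, and helper name are illustrative assumptions, not part of Open WebUI or RAGFlow):

```python
import json

def iter_sse_deltas(lines):
    """Parse OpenAI-style SSE lines ('data: {...}') into content deltas."""
    for raw in lines:
        line = raw.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and other fields
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            return  # end-of-stream sentinel used by OpenAI-compatible APIs
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        content = delta.get("content")
        if content:
            yield content

# Probing a live endpoint (hypothetical URL and key; requires `requests`):
# import requests
# resp = requests.post(
#     "http://ragflow_url/api/v1/chats_openai/<chat_id>/chat/completions",
#     headers={"Authorization": "Bearer <api_key>"},
#     json={"model": "any",
#           "messages": [{"role": "user", "content": "hi"}],
#           "stream": True},
#     stream=True,
# )
# for delta in iter_sse_deltas(resp.iter_lines(decode_unicode=True)):
#     print(delta, end="", flush=True)  # prints incrementally if streaming works
```

If the deltas print one by one against RAGFlow directly but arrive all at once through the Open WebUI router, the buffering happens in the proxy, not in RAGFlow.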

Steps to Reproduce

1. Set up a RAGFlow OpenAI-compatible API.
2. Add an external connection under admin/setting/: fill the API endpoint URL with "http://ragflow_url/api/v1/chats_openai/c0161ad007d111f08ad10242ac120006" and fill in the API key.
3. Add an arbitrary model name; RAGFlow resolves it automatically.
4. Enable direct connection under admin/setting/external_connections.
5. Set streaming to True in the model parameters.
6. Start a chat: the response does not stream.
7. Try the same under 'user/settings', which is handled entirely by the frontend ("src/lib/apis/openai/index.ts").
8. It just works.

Logs & Screenshots

Expected way:
<img width="1440" alt="Image" src="https://github.com/user-attachments/assets/d9f21877-5458-4df2-9f39-d679f70ab38b" />
<img width="1440" alt="Image" src="https://github.com/user-attachments/assets/f2a0ac56-82c5-4c29-bce2-8994e3c4f711" />

Actual behavior:
<img width="1440" alt="Image" src="https://github.com/user-attachments/assets/8ff433b7-2e08-49ea-ae82-fffab5431fc7" />

How the streaming one works:
<img width="1440" alt="Image" src="https://github.com/user-attachments/assets/aeafa2d0-2f62-45b3-9269-9850e20732b6" />

Additional Information

The frontend path works, but I can't share it with users other than the admin.
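A quick way to distinguish a genuine SSE stream from a buffered one-shot reply is to look at the response headers before reading the body. This is only a heuristic sketch (the function name is my own, and header behavior can vary by server and proxy):

```python
def looks_like_sse(headers):
    """Heuristic: does a response look like a live SSE stream rather than
    a single buffered JSON body? Header names are treated case-insensitively."""
    h = {k.lower(): v.lower() for k, v in headers.items()}
    # A streaming reply advertises the SSE media type...
    if not h.get("content-type", "").startswith("text/event-stream"):
        return False
    # ...and is typically chunked, so it carries no Content-Length.
    return "content-length" not in h

# Usage sketch (requires `requests`; URL is a placeholder):
# import requests
# resp = requests.post("http://open-webui/api/chat/completions",
#                      json={"stream": True, ...}, stream=True)
# print(looks_like_sse(resp.headers))
```

If the proxied response comes back as `application/json` with a `Content-Length`, the router has collapsed the stream before it ever reaches the browser, which would match the behavior described above.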

GiteaMirror added the bug label 2026-04-19 22:38:36 -05:00
Author
Owner

@tth37 commented on GitHub (Apr 22, 2025):

Is there anything abnormal in browser console logs?

<!-- gh-comment-id:2819967648 -->
Reference: github-starred/open-webui#16816