[GH-ISSUE #183] "New Chat" created per word streamed, when you click it during message generation. #50633

Closed
opened 2026-05-05 10:46:16 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @leochoo on GitHub (Dec 6, 2023).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/183

Describe the bug
When you click "New Chat" while a message is being streamed, a new chat appears to be created for every word streamed.

To Reproduce
Steps to reproduce the behavior:

  1. Choose any model and start generating a message.
  2. While the response is streaming, click "New Chat".
  3. New chats are created continuously, one per streamed word, and stop only when the answer stream ends.

Expected behavior
Message generation stops and the new chat starts.
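One way the expected behavior could be implemented (a hedged sketch, not the actual open-webui code; all names here are illustrative) is to keep an `AbortController` for the in-flight streaming request and abort it when "New Chat" is clicked, so stray chunks can no longer drive UI updates:

```typescript
// Sketch only: `streamController`, `streamCompletion`, and `newChat` are
// hypothetical names, not identifiers from the open-webui codebase.

let streamController: AbortController | null = null;

// Stream a completion, collecting decoded chunks until the stream ends
// or the request is aborted.
async function streamCompletion(url: string, body: unknown): Promise<string[]> {
  streamController = new AbortController();
  const chunks: string[] = [];
  try {
    const res = await fetch(url, {
      method: "POST",
      body: JSON.stringify(body),
      signal: streamController.signal, // aborting stops the reader mid-stream
    });
    const reader = (res.body as any).getReader();
    const decoder = new TextDecoder();
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      chunks.push(decoder.decode(value));
    }
  } catch {
    // An AbortError here is expected when the user starts a new chat
    // mid-generation; swallow it and return what was received so far.
  }
  return chunks;
}

function newChat(): void {
  // Cancel any in-flight generation before resetting chat state, so
  // late-arriving chunks can't keep creating chats.
  streamController?.abort();
  streamController = null;
}
```

The key point is that the abort happens before the chat state is reset, which is what prevents the "one new chat per streamed word" symptom described above.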

Screenshots
![CleanShot 2023-12-06 at 19 26 35](https://github.com/ollama-webui/ollama-webui/assets/26427048/5c35ae64-ca4c-493f-b506-df52ff1ae8f5)

Desktop (please complete the following information):

  • OS: macOS
  • Browser: Edge
  • Version: 119.0.2151.72 (Official build) (arm64)

Additional context
I'm only running the frontend; I am NOT using Docker and NOT using the backend.
Instead, I'm using a previously installed local Ollama.
I run the frontend via `npm run dev` and Ollama via `ollama serve`.
The frontend `.env` points to `OLLAMA_API_BASE_URL='http://127.0.0.1:11434/api'`.

@tjbck commented on GitHub (Dec 6, 2023):

Hi, Thanks for creating this issue! Taking a look right now, stay tuned!

<!-- gh-comment-id:1842997331 -->
@tjbck commented on GitHub (Dec 6, 2023):

Hi, I just merged the branch with the fix into main. Please let me know if the issue still persists after you try out the latest release. Thanks!

<!-- gh-comment-id:1843242462 -->

Reference: github-starred/open-webui#50633