[GH-ISSUE #2647] enh: save response before switching chat #28489

Closed
opened 2026-04-25 03:06:01 -05:00 by GiteaMirror · 11 comments
Owner

Originally created by @0x7CFE on GitHub (May 29, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2647

Originally assigned to: @tjbck on GitHub.

Bug Report

Description

Bug Summary:
Response lost if chat is switched during generation

Steps to Reproduce:

  1. Ask something in the chat window
  2. LLM starts generating the response
  3. Switch to another topic by selecting previous chats
  4. Return to original topic
  5. Realize that you can't see what is going on. CPU is busy, but nothing is being shown in the UI, there is no ⏹️ button, etc. There is no way to abort the generation.
  6. Only after the LLM finishes generating is the whole reply shown, all at once.

Expected Behavior:
I expect generation to continue in the background and to see its progress when I return to the chat where the question was asked. I should also have the option to abort the generation before it finishes.

Actual Behavior:
The reply being generated is invisible, and there is no way to abort it.
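The expected behavior could be sketched roughly as follows (a hypothetical illustration, not Open WebUI's actual implementation): keep each in-flight generation in a store keyed by chat id, so switching chats neither hides the partial response nor loses the abort handle. All names here (`startGeneration`, `appendChunk`, `abortGeneration`) are illustrative assumptions.

```typescript
// Hypothetical sketch: in-flight responses live in a module-level store
// keyed by chat id, independent of which chat view is currently mounted.

type Generation = {
  controller: AbortController; // lets any view abort the underlying request
  text: string;                // partial response accumulated so far
};

const generations = new Map<string, Generation>();

// Register a new in-flight generation for a chat.
function startGeneration(chatId: string): Generation {
  const gen: Generation = { controller: new AbortController(), text: "" };
  generations.set(chatId, gen);
  return gen;
}

// Called for each streamed chunk; a chat view re-renders from this store
// whenever it becomes visible, including after switching back.
function appendChunk(chatId: string, chunk: string): void {
  const gen = generations.get(chatId);
  if (gen) gen.text += chunk;
}

// Abort and forget the generation for a chat, from any view.
function abortGeneration(chatId: string): void {
  const gen = generations.get(chatId);
  if (gen) {
    gen.controller.abort();
    generations.delete(chatId);
  }
}
```

Because the stream writes into the store rather than into the mounted component, returning to the original chat shows the partial text, and the stop button can stay available as long as the store holds an entry for that chat.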

Environment

  • Open WebUI Version: v0.1.123

  • Ollama (if applicable): 0.1.39

  • Operating System: Ubuntu 22.04

  • Browser (if applicable): Firefox 125.0

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

[Describe the method you used to install the project, e.g., manual installation, Docker, package manager, etc.]

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

Author
Owner

@hershal commented on GitHub (Jun 3, 2024):

I'm seeing the same issue. If I navigate away from the chat before the LLM finishes generating text (e.g., clicking on another chat), then my latest input and response is lost.

Author
Owner

@skobkin commented on GitHub (Jun 4, 2024):

I'd say that it isn't a bug, but a good feature request.

Author
Owner

@0x7CFE commented on GitHub (Jun 5, 2024):

Well, bug or not, it fails the principle-of-least-surprise test. The user ends up in a situation where they have seemingly lost the pending generation, the CPU is running hot, and there is no button to abort the generation.

Author
Owner

@TFWol commented on GitHub (Jul 8, 2024):

What should be done is to have an entry appear in the sidebar as soon as the New Chat button is pressed.

Author
Owner

@TFWol commented on GitHub (Aug 4, 2024):

@skobkin Explain your thumbs down

Author
Owner

@skobkin commented on GitHub (Aug 4, 2024):

@TFWol I plead the Fifth!

Jokes aside, I don't want a new chat tab to appear every time I open a new chat, especially considering that "new chat" is basically the index page of OpenWebUI. If it were implemented the way you suggest, I'd be removing useless empty chats on a regular basis.

I see the current approach, where the chat is "created" only when its state changes from the default/blank one, as a good choice for the current UI.

But implementing something in between isn't a bad idea, as I see it. For example, OpenWebUI could create a new chat named "Draft" (or something like that) if you wrote something but then navigated away from the chat, like many email applications and web services do.

Author
Owner

@TFWol commented on GitHub (Aug 4, 2024):

I'll be removing useless empty chats on a regular basis.

Oh, I didn't mean to keep blank chats around; that would drive me crazy.

I was going with the assumption that it would make the blank ones disappear automatically if nothing was typed. I had mentioned the new chat entry because when I type part of a prompt and want to look at something like a chatbot setting real quick, I end up losing everything I typed.

For example OpenWebUI can create new chat named "Draft" (or something like that)

I pretty much meant this 😄
Having a new chat placeholder for non-empty boxes would be a reminder that you have something there.

Author
Owner

@skobkin commented on GitHub (Aug 4, 2024):

@TFWol Sure.

If it's created only when you start typing and removed if it's empty again, then I'm fine with that behavior.

Author
Owner

@navilg commented on GitHub (Nov 26, 2024):

Is this fix on the roadmap? I don't see any updates in the last 4 months.

Author
Owner

@tjbck commented on GitHub (Nov 26, 2024):

@navilg It's not the highest priority atm, however, PR Welcome!

Author
Owner

@tjbck commented on GitHub (Dec 19, 2024):

Fixed on dev!

Reference: github-starred/open-webui#28489