[GH-ISSUE #21564] issue: "Edit" and "Continue Response" don't work properly #58184
Originally created by @TheObserver-000 on GitHub (Feb 18, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/21564
Check Existing Issues
Installation Method
Docker
Open WebUI Version
0.8.3
Ollama Version (if applicable)
0.16.1
Operating System
Windows 11
Browser (if applicable)
Chrome 145
Confirmation
Expected Behavior
"Continue Response" must complete the "Paris is the capital of" with "France." and further attempts with "Continue Response" should not generate anything else because the response is finished, as the prompt says.
Actual Behavior
"Edit" Doesn't actually edit the message, and "Continue Response" generates an additional response instead of continuing the current one.
Steps to Reproduce
See the attached video.
Logs & Screenshots
https://github.com/user-attachments/assets/8a16b5ee-3956-40e7-b108-f171a379211f
Additional Information
No response
@Ghosthree3 commented on GitHub (Feb 18, 2026):
It has broken in two stages since 0.8.0.
In 0.8.0, a response that had already completed would, after editing, resume generating from the point where you edited it, but the output was appended to the original completed message. So in your video example, if you deleted 'France.' and hit continue, it would output "Paris is the capital of France.France.". If you stopped the output before it completed, then continue after editing the response would work as intended.
Then in 0.8.2 it broke further: edits are now ignored entirely, and pressing continue gives an appended output similar to pressing regenerate, appended to whatever the previous final output was (as shown in your video).
I've pinned to 0.8.1 for the time being. While it is annoying to lose the continue functionality when the output does complete, if I continuously catch and stop it before it does, it works as it did prior to 0.8.0 (and if it does complete, I hit regenerate, quickly stop it, and paste in my edited output before continuing).
@MrOutsider commented on GitHub (Feb 20, 2026):
Same issue to the T.
My Ollama version is 0.16.2
@quark6789 commented on GitHub (Feb 21, 2026):
I have encountered the same thing.
@TheObserver-000 commented on GitHub (Feb 23, 2026):
As of v0.8.5, it's still not fixed
@andrewmm511 commented on GitHub (Feb 23, 2026):
Same. I use the continue functionality every day for pre-fill.
@Varalan commented on GitHub (Feb 26, 2026):
I'm getting the same exact thing with 0.8.5/docker/LM Studio 0.4.4
@BillyJin-NSRL-V-KVSHIN commented on GitHub (Mar 3, 2026):
0.8.8, still not fixed
@MenceyBentayga commented on GitHub (Mar 8, 2026):
The PR still hasn't gone through as of 0.8.9; it's marked as waiting for a maintainer.
Isn't it kind of a basic feature to wait two weeks on?
@frenzybiscuit commented on GitHub (Mar 8, 2026):
Same issue.
I will let the LLM respond. Then I will go back and edit the LLM message.
Then, I reply.
It will recall the deleted portion of the response.
@MenceyBentayga commented on GitHub (Mar 9, 2026):
As of 0.8.10, the 20-line pull request has been waiting for a maintainer to look at it for two weeks. Hopefully the fix for such an important user-facing feature can be merged soon.
@Iniquitatis commented on GitHub (Mar 9, 2026):
I'd argue a fundamental feature.
@Classic298 commented on GitHub (Mar 9, 2026):
Has anyone tested the PR? I see zero comments under the PR from people testing it and confirming that it has no downstream or unintended side effects.
It would definitely be easier for Tim to merge it if anyone here would actually also test the PR and help get it merged.
@MenceyBentayga commented on GitHub (Mar 9, 2026):
Will do at my earliest possible chance and report back. Is there anything in particular that should be tested other than the visible stuff the changed code could affect?
@Classic298 commented on GitHub (Mar 9, 2026):
Test for the obvious, which is what this issue is about:
- Continue response
- Edit message
But also check around message branching:
- Edit your own message to create message branches, and check whether the correct branch is sent with the correct edited assistant message or the correct continued response. Do the same if you branch the assistant message by regenerating it.
So everything around message branching should also be tested here, to confirm it still behaves the same way and that the fix also works correctly for these cases.
Also check what happens if you edit or continue a response for older messages in the history.
That's what I can think of right now that would be good to test.
@sayan1999 commented on GitHub (Mar 11, 2026):
Please fix this; it is a very important feature.
@sayan1999 commented on GitHub (Mar 15, 2026):
Why is no one looking into this fundamental bug? Can anyone test the fix and add comments to the PR?
@Classic298 commented on GitHub (Mar 15, 2026):
Hey @sayan1999 — to answer your questions directly:
"Why is no one looking into this?" — people are. Two PRs have been opened, one was closed because it was slop. The other is still open but someone already reported that the second PR is also not working at all. The issue is tracked. "No one looking into it" and "not yet merged" are very different things. Fixing message branching edge cases isn't a one-liner; there are real downstream risks which I outlined above, and a rushed merge of a broken fix is worse than the current state.
"Can anyone test the fix?" — Yes - you can. That's exactly what I asked the community to do a week ago. The PR has been sitting there with zero community-verified testing. If this feature matters enough for you to comment TWICE asking why it's not fixed, then it should matter enough for you to spend 10 minutes pulling the branch and running through the test cases I listed. That feedback directly helps Tim make the merge decision - or closing the second PR as well if it also turns out to not work.
Open source moves faster when people contribute, not just when they ask.
@Classic298 commented on GitHub (Mar 15, 2026):
@MenceyBentayga any chance you got around to testing yet?
@MenceyBentayga commented on GitHub (Mar 15, 2026):
Hello! I thought I had seen a response saying that someone else had tested it and it wasn't working, so I didn't try it myself.
Now I look at this thread and that response is nowhere to be found. Maybe the user deleted it. I will test it myself tonight, in that case.
@MenceyBentayga commented on GitHub (Mar 15, 2026):
I tried merging that pull request on top of 0.8.5. Sadly, it doesn't seem to work.
@Classic298 commented on GitHub (Mar 15, 2026):
Thanks. Closing that PR.
@eniraa commented on GitHub (Mar 17, 2026):
Would appreciate if someone could take a look at my pull request (#22773), which only fixes message editing functionality. I'm happy to make any changes needed to get this merged.
@MenceyBentayga commented on GitHub (Mar 17, 2026):
LGTM!
@frenzybiscuit commented on GitHub (Mar 26, 2026):
Hello,
This is still an issue in 0.8.11. In addition, regenerations don't work properly either: the first time you regenerate, the LLM will mix context and get confused; the second time you regenerate, it works fine.
@syndicatedshannon commented on GitHub (Mar 30, 2026):
I briefly attempted to verify this fix, but still saw the issue I believe it is intended to resolve. The edit appears to be completed, but is then undone when I continue or perform any further operation.
Initial question: (screenshot)
Edit, deleting "Paris.": (screenshot)
Then press "Continue": (screenshot)
Alternately, edit, replacing "Paris" with "Rome": (screenshot)
Then press "Continue": (screenshot)
I didn't trace exactly what is happening, but based on the responses it seems to receive an amalgam of both the original and edited content.
Regarding the portion of the reported defect above that says "Continue" shouldn't produce any further text once it completes: that isn't my understanding of the "Continue" feature. My understanding is that if the model generates an end token, that may be the apparent outcome, but it is not a guarantee. Continue may be used to give the assistant another opportunity to generate a token other than an end token. My understanding may be wrong.
Specifically, I tested the OCI image I understand would be built by the PR, using the following:
docker build --no-cache -t open-webui-pr22773 https://github.com/eniraa/open-webui.git#dev
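(For anyone else wanting to try the same thing, the built image can then be started with the usual defaults from the Open WebUI docs; the port mapping and volume name below are just those defaults, nothing specific to this PR:)
docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui-pr22773 open-webui-pr22773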
@eniraa commented on GitHub (Mar 31, 2026):
Thanks for taking a look into the continuation functionality. My current fix only addresses the edit not working properly, but I'll take a crack at the continue issue once I have more time.
@syndicatedshannon commented on GitHub (Mar 31, 2026):
Thanks @eniraa. I apologize, I misunderstood the PR (and maybe the issue), and I see that edit by itself appears to work correctly. When I edit and then submit a new user turn, e.g. "Please repeat your last response verbatim.", I get the assistant response I'd expect.
@syndicatedshannon commented on GitHub (Mar 31, 2026):
Also, I noticed that the behavior is different when streaming is enabled vs. disabled. When disabled, just the continuation is left in the assistant response (too little text). When enabled, the result above (too much text) is seen.
Afterwards (in both cases), the client sends the text displayed in the UI to "api/chat/completed", which then persists after refresh. Seems sus to me, but it's probably a separate issue, if not intended.
@eniraa
@YashasviMantha commented on GitHub (Apr 4, 2026):
Still broken. I am on v0.8.12
@dathbe commented on GitHub (Apr 14, 2026):
Possibly a related issue (if not, let me know and I'll open another Issue). When I use Open-WebUI to edit an image, it will work fine. But if I ask it to "regenerate", it will lose all context and give me a text response. It's not yet clear to me whether it is dropping the uploaded image or unticking the "image" integration, but either way regenerate fails.
Running 0.8.12 via Docker ghcr.io/open-webui/open-webui:ollama
Update: I get the same issue when I try to create an image (not using an input image), so the problem seems to be with the "image" integration getting unticked for the regenerate stage.
@CookSleep commented on GitHub (Apr 17, 2026):
In version 0.8.12, even after editing a model's response, sending a new message still results in the model seeing the old version. This issue is really frustrating me 😭
@MenceyBentayga commented on GitHub (Apr 21, 2026):
Given the branch has been merged, can someone confirm the behaviour for both edits and continuations is as expected in 0.9.0?
@superjamie commented on GitHub (Apr 21, 2026):
This is not fixed in v0.9.0.
Looking at the actual request in the llama-swap Activity tab, Open-WebUI is still sending the full previous response to the backend OpenAI server, not the new edited, shorter response.
(I am still running v0.7.2 as a workaround; this is fine for me.)
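If you are not running llama-swap, another way to see exactly which messages get sent on "Continue" is to point a temporary OpenAI connection at a throwaway stub that just prints the request body. The sketch below is a quick inspection hack, not anything from Open WebUI; the port and the fixed reply are arbitrary choices, and if streaming is enabled the UI may show an error, but the request is printed either way.

```python
# log_requests.py - throwaway OpenAI-compatible stub that prints every message
# Open WebUI includes in a completion request. Add http://localhost:11435/v1 as
# an OpenAI connection, press Continue, and watch the console.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def _send_json(self, obj):
        data = json.dumps(obj).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def do_GET(self):
        # Open WebUI lists models when the connection is added; report one stub model.
        self._send_json({"object": "list", "data": [{"id": "stub", "object": "model"}]})

    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        payload = json.loads(body or b"{}")
        # Print exactly what the frontend sent, role by role.
        for m in payload.get("messages", []):
            print(f"[{m.get('role')}] {m.get('content')!r}")
        # Minimal non-streaming reply so the UI gets something back.
        self._send_json({
            "id": "stub", "object": "chat.completion", "model": "stub",
            "choices": [{"index": 0, "finish_reason": "stop",
                         "message": {"role": "assistant", "content": "stub reply"}}],
        })

HTTPServer(("127.0.0.1", 11435), Handler).serve_forever()
```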
@taroxd commented on GitHub (Apr 26, 2026):
Not fixed in 0.9.2. When I tried to modify the original message directly in the SQLite DB, I found that the message edited in the web page does not seem to be saved to the DB.
My workaround is exporting the chat as JSON, removing the original content, and importing it back. The removal is done by an AI-written Python script.
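For anyone who wants to do the same without generating their own script, a minimal version could look roughly like the sketch below. It simply replaces the old message text wherever it appears in the exported file, so it does not depend on the exact export schema (which may differ between versions); the two text constants are placeholders you would change for your own chat.

```python
# edit_export.py - crude workaround: edit a message in an exported Open WebUI
# chat JSON, then re-import the file through the UI.
import json
import sys

OLD_TEXT = "Paris is the capital of France.France."  # stored text to replace (placeholder)
NEW_TEXT = "Paris is the capital of"                  # what it should become (placeholder)

def walk(node):
    """Recursively replace matching 'content' fields anywhere in the export."""
    if isinstance(node, dict):
        if node.get("content") == OLD_TEXT:
            node["content"] = NEW_TEXT
        for value in node.values():
            walk(value)
    elif isinstance(node, list):
        for item in node:
            walk(item)

if __name__ == "__main__":
    path = sys.argv[1]  # path to the exported chat JSON
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    walk(data)
    with open(path, "w", encoding="utf-8") as f:
        json.dump(data, f, ensure_ascii=False, indent=2)
```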
@santiagozky commented on GitHub (Apr 27, 2026):
My Python knowledge is limited. I found that when editing a response, the chat endpoint (update_chat_by_id) is called, which writes a whole JSON blob into the chat table, but the chat_message table, which holds the individual messages, does not seem to be updated at all. There is a function update_chat_message_by_id, but it does not seem to be called anywhere in the code.
What I could not figure out is where the content that is sent to the LLM is taken from. I assume it is the chat_message table, but I could not locate it. Maybe someone can clarify why the chat content is duplicated in the database (a full JSON in chat and individual messages in chat_message) and whether both are needed. I went through the old code and I could not figure it out either.
(I mistakenly exposed a private URL in a now-deleted post.)
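To check on a local install whether an edit ever reaches the database at all, something like the read-only sketch below can be used. The file path, table name, and column names are assumptions based on the discussion above (the default SQLite file, a chat table with a JSON chat column and an updated_at timestamp); adjust them if your schema differs.

```python
# inspect_chat_db.py - read-only check of what is actually stored after editing
# a response in the UI. All names here are assumptions; verify against your DB.
import json
import sqlite3

DB_PATH = "/app/backend/data/webui.db"  # default location inside the Docker container

conn = sqlite3.connect(DB_PATH)
conn.row_factory = sqlite3.Row

# Look at the most recently updated chat and dump its stored messages.
for row in conn.execute(
    "SELECT id, title, chat FROM chat ORDER BY updated_at DESC LIMIT 1"
):
    stored = json.loads(row["chat"])
    print(f"chat {row['id']!r}: {row['title']!r}")
    for msg in stored.get("messages", []):
        # Compare this stored text with what the web UI currently displays.
        print(f"  [{msg.get('role')}] {msg.get('content')!r}")

conn.close()
```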
@WuRunBear commented on GitHub (Apr 29, 2026):
This bug still hasn't been fixed. If anyone urgently needs the edit feature, you can try replacing the code of update_chat_by_id; that temporarily works around edits not taking effect, you just have to reapply the change to that part of the code after each later update.
However, this might affect other features working normally; it hasn't caused any problems for me though, hahaha.