[PR #22710] [CLOSED] fix: Continue Response generates additional response instead of continuing #26826

opened 2026-04-20 06:43:54 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/open-webui/open-webui/pull/22710
Author: @BillionClaw
Created: 3/15/2026
Status: Closed

Base: main ← Head: clawoss/fix/edit-continue-response


📝 Commits (1)

  • 8385111 fix: Continue Response generates additional response instead of continuing

📊 Changes

1 file changed (+11 additions, -2 deletions)


📝 src/lib/components/chat/Chat.svelte (+11 -2)

📄 Description


Fixes #21564

When the user clicked 'Continue Response', the LLM generated a completely new response instead of continuing from where it left off.

Problem

The 'Continue Response' feature was not working properly because when the button was clicked, the frontend would:

  1. Set the current response message's done flag to false
  2. Re-send the conversation history to the LLM
  3. The LLM would generate a completely new response since it had no indication it should continue

Solution

Modified sendMessageSocket to accept an optional continuePrompt parameter. When continuing a response, a user message with content 'Continue' is appended to the messages sent to the LLM. This signals the LLM that it should continue from where it left off.

Changes

  • Added optional continuePrompt parameter to sendMessageSocket function
  • When continuePrompt is provided, append a user message with that content to the messages array
  • Modified continueResponse to pass { continuePrompt: 'Continue' } to sendMessageSocket
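The change above can be sketched roughly as follows. This is a minimal illustration of the described approach, not the actual Chat.svelte code; the `buildPayload` helper and the message shape are assumptions introduced for this sketch.

```typescript
// Hypothetical sketch of the fix described in this PR. The real change lives
// in src/lib/components/chat/Chat.svelte; names here are illustrative only.
type ChatMessage = { role: 'user' | 'assistant' | 'system'; content: string };

// Assumed stand-in for the part of sendMessageSocket that assembles the
// messages sent to the LLM, now taking an optional continuePrompt.
function buildPayload(history: ChatMessage[], continuePrompt?: string): ChatMessage[] {
	const messages = [...history];
	if (continuePrompt) {
		// Append a user turn so the model knows to pick up where it left off
		// instead of starting a fresh answer.
		messages.push({ role: 'user', content: continuePrompt });
	}
	return messages;
}

// continueResponse would then invoke the sender with { continuePrompt: 'Continue' }:
const history: ChatMessage[] = [
	{ role: 'user', content: 'Explain WebSockets' },
	{ role: 'assistant', content: 'WebSockets are a protocol that…' }
];
const payload = buildPayload(history, 'Continue');
// payload now ends with { role: 'user', content: 'Continue' }
```

Without the `continuePrompt` argument, `buildPayload` returns the history unchanged, which matches the normal send path.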

Testing

  • Type-checked the changes with npm run check
  • Linted the changes with npm run lint:frontend

This PR was created with assistance from an AI agent (ClawOSS).


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-20 06:43:54 -05:00

Reference: github-starred/open-webui#26826