Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-05 18:38:17 -05:00)
[PR #11054] [CLOSED] feat - human-in-loop self-prompting via filters #9447
📋 Pull Request Information
Original PR: https://github.com/open-webui/open-webui/pull/11054
Author: @i-am-david-fernandez
Created: 3/2/2025
Status: ❌ Closed
Base: dev ← Head: feature/insert-prompt-via-filter
📝 Commits (1)
531bf96: Added function to set the prompt from a completed chat payload.
📊 Changes
1 file changed (+14 additions, -0 deletions)
📝 src/lib/components/chat/Chat.svelte (+14 -0)
📄 Description
Pull Request Checklist
Note to first-time contributors: Please open a discussion post in Discussions and describe your changes before submitting a pull request.
Before submitting, make sure you've checked the following:
- Pull request targets the dev branch.

Changelog Entry
Description
Certain broader AI/LLM functions require the ability for an LLM to self-prompt, that is, to provide itself with a follow-up prompt in order to direct its subsequent responses. As a first step, and to aid safety and security, this should retain a human in the loop, so that the human operator may review and approve the prompt rather than have it submitted automatically.
Open-WebUI already has the ability, via Outlet Filters, to programmatically extract content from an LLM's response; what is missing is the ability to feed any extracted material back via the (next) prompt. This PR adds that capability.

Added
This changeset adds a single new frontend function to ease programmatic setting of the (next) prompt, i.e., prefilling the prompt text input. If a "chat completed" payload contains an optional prompt field, this function is called to prefill the prompt with the payload-supplied value.

Changed
N/A
Deprecated
N/A
Removed
N/A
Fixed
N/A
Security
N/A
Breaking Changes
N/A
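As a language-neutral sketch of the behavior described in the Added section above: the actual change lives in Chat.svelte (Svelte/TypeScript), but the logic can be illustrated in Python. The function and parameter names here are hypothetical, not taken from the PR diff:

```python
def handle_chat_completed(payload: dict, set_prompt_input) -> None:
    """Sketch of the new frontend behavior (hypothetical names).

    If the "chat completed" payload carries the optional "prompt" field,
    prefill the prompt input with its value. The prompt is only
    prefilled, never submitted, so the human operator stays in the loop.
    """
    prompt = payload.get("prompt")
    if prompt:
        set_prompt_input(prompt)  # prefill the prompt text input
```

Payloads without a prompt field leave the input untouched, so existing filters are unaffected.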
Additional Information
A minimal Filter to prefill the prompt simply provides a fixed value ("This is the next prompt.") as the next prompt, but illustrates the mechanism by which an Outlet Filter may provide one. Additional processing/logic would allow arbitrary prompts to be provided, typically as a result of processing the LLM response.

See also https://github.com/open-webui/open-webui/discussions/11041.
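The minimal Filter itself was not preserved in this mirror; a sketch consistent with the description, assuming Open WebUI's Python filter convention (a Filter class whose outlet method receives and returns the chat-completed payload; real filters usually also declare a pydantic Valves model, omitted here for brevity), might look like:

```python
class Filter:
    """Minimal Outlet Filter sketch: attaches a fixed next prompt.

    With this PR, when the chat-completed payload returned by the outlet
    contains a "prompt" field, the frontend prefills the prompt text
    input with its value rather than submitting it automatically.
    """

    def outlet(self, body: dict) -> dict:
        # A real filter would derive this value by processing the LLM
        # response carried in `body`; a fixed string illustrates the
        # mechanism.
        body["prompt"] = "This is the next prompt."
        return body
```

The human operator still reviews, edits, and sends the prefilled prompt, keeping the loop supervised.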
Screenshots or Videos
N/A
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.