[GH-ISSUE #17997] feat: Variables in recommended prompts #57124
Originally created by @aleksanderson94 on GitHub (Oct 2, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/17997
Check Existing Issues
Problem Description
Discussed in #17961
Desired Solution you'd like
Implement a feature that allows specifying a command (instead of static text) within the recommended prompts for a model. The expected behavior is that upon clicking a recommended prompt containing such a command, the system should properly launch the input window for the user to enter values for the variables.
Alternatives Considered
No response
Additional Context
No response
@tjbck commented on GitHub (Oct 2, 2025):
Hmm, this should be supported today with curly-bracket variables, no?
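For example, a suggestion along these lines should already open the variable input when clicked (the field names here are just an illustration, not the exact schema):

```python
# Illustrative only: a prompt suggestion whose content carries a
# curly-bracket variable. Field names may differ from the actual schema.
prompt_suggestions = [
    {
        "title": ["Summarize a text", "Condense it to a few bullet points"],
        "content": "Summarize the following text in a few bullet points:\n\n{{TEXT}}",
    },
]
```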
@aleksanderson94 commented on GitHub (Oct 2, 2025):
To clarify, the idea isn't to write the prompt with variables directly in the recommendations themselves. The intended workflow is to create the prompt in the Workspace, and then in the recommendations, specify only the command to call it, perhaps with some accompanying text. This used to work before (possibly unintentionally), but it has stopped working since the 0.6.27 update.
@HannesStrohkopp commented on GitHub (Oct 2, 2025):
I concur with this request.
Actually, this functionality was a very important part of bringing on board those of my users who don't have much experience with inference:
Simply selecting a use case in the suggestions below the chat bar (e.g. "Professionalize your text..."), which auto-pasted a prepared prompt (e.g. "/professionalize_text") and then led to the actual prompt with its variable fields, was a very low entry hurdle for them to understand prompts and how to use the UI.
Now, in this workflow, the text in the suggestion (in this example "/professionalize_text") is pasted and sent directly to the LLM without resolving the prompt template, so there is no opportunity to enter the variables.
It would be great if this workflow could be restored!
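To make this concrete, the two pieces roughly look like this (the dict shapes are only illustrative, not the exact schema):

```python
# Illustrative sketch only; field names may differ from the actual schema.

# 1) A Workspace prompt registered under a slash command; the {{TEXT}}
#    placeholder is what opens the variable input fields.
workspace_prompt = {
    "command": "/professionalize_text",
    "content": "Rewrite the following text in a professional tone:\n\n{{TEXT}}",
}

# 2) A prompt suggestion below the chat bar that carries only the command.
#    Clicking it used to paste "/professionalize_text" into the input, which
#    then resolved to the prompt above instead of being sent to the model as-is.
suggestion = {
    "title": ["Professionalize your text", "Turn a rough draft into formal prose"],
    "content": "/professionalize_text",
}
```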
@HannesStrohkopp commented on GitHub (Oct 6, 2025):
Actually, to add on to this.
I think the easiest solution would be to remove the current auto-click when choosing a suggestion.
Then, the functionality of the aforementioned use case would be reinstated, while any other suggestion would only require one additional click/Enter.
@HannesStrohkopp commented on GitHub (Oct 28, 2025):
Kinda sad this was closed/removed.
Forces me to remove all suggestions in production. :(
@Classic298 commented on GitHub (Dec 14, 2025):
Prompt suggestions have been refactored and can now contain variables. You only need to set prompt suggestions to be inserted into the message field instead of being sent directly as a message.
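Whether a suggestion carries variables (and therefore needs the input fields rather than being fired off as-is) comes down to spotting {{...}} placeholders in its content. A purely illustrative check, not the actual implementation:

```python
import re

# Matches {{VARIABLE}} style placeholders; illustrative only, not the
# actual Open WebUI implementation.
VARIABLE_PATTERN = re.compile(r"\{\{\s*([A-Za-z0-9_]+)\s*\}\}")

def suggestion_variables(content: str) -> list[str]:
    """Return the placeholder names found in a suggestion's content."""
    return VARIABLE_PATTERN.findall(content)

# A suggestion with variables should be inserted into the message field
# so the user can fill them in, instead of being sent immediately.
print(suggestion_variables("Summarize this for a {{AUDIENCE}}:\n\n{{TEXT}}"))
# -> ['AUDIENCE', 'TEXT']
```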
@HannesStrohkopp commented on GitHub (Dec 15, 2025):
Thanks for the follow-up.
Is that now an admin-level setting to insert instead of send?
@Classic298 commented on GitHub (Dec 15, 2025):
It's in the user interface settings.
@HannesStrohkopp commented on GitHub (Dec 15, 2025):
I see. Unfortunately, that does not really solve the issue for me. It does not help if this setting is user-specific and no default can be set by an admin.
Most users do not go through the settings and just roll with the default config. Thus, I cannot use prompt suggestions to advertise preconfigured prompts. :/
@Classic298 commented on GitHub (Dec 15, 2025):
@HannesStrohkopp it does help, because a PR is ready that lets admins define the default interface settings for users, and it will hopefully be merged soon.
@HannesStrohkopp commented on GitHub (Dec 16, 2025):
@Classic298 Ahhh, that is perfect, thanks a ton!