Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-10 07:43:10 -05:00)
feat: Option to use first few words of prompt as title (no AI) #5509
Originally created by @VanceVagell on GitHub (Jun 11, 2025).
Problem Description
I run local LLMs and serve them through Open WebUI. Because some of my LLMs are slower CPU inference systems, I turn off title generation: I don't want to waste time generating a title, and I'd rather the model respond to my second query immediately.
Desired Solution you'd like
It would be very helpful to have an option like:
Settings > Interface > Title generation style
with two options (maybe a drop-down?):
- AI-generated summary (current behavior)
- First few words of the prompt (no AI)
The start of the prompt would be good enough in many cases to jog my memory about what a conversation was (far better than the "New Chat" they all show right now with title generation off), without incurring the non-trivial cost of summarization on slower local models.
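The requested behavior could be sketched as a small helper that truncates the prompt to its first few words. This is a hypothetical illustration, not code from Open WebUI; the word limit of 6 and the `title_from_prompt` name are assumptions.

```python
import re

def title_from_prompt(prompt: str, max_words: int = 6) -> str:
    """Build a chat title from the first few words of the user's prompt.

    Hypothetical sketch of the requested "no AI" title mode; the
    6-word cap is an assumed default, not taken from the issue.
    """
    # Split on any whitespace so multi-line prompts work too.
    words = re.findall(r"\S+", prompt.strip())
    if not words:
        # Fall back to the placeholder the UI currently shows.
        return "New Chat"
    title = " ".join(words[:max_words])
    if len(words) > max_words:
        title += " ..."  # signal that the prompt was truncated
    return title

print(title_from_prompt("How do I configure nginx as a reverse proxy?"))
```

Because this is plain string handling, it costs effectively nothing compared to a model call, which is the point of the request.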
Alternatives Considered
For now I just turn off title generation, which isn't optimal because it's hard to then find a specific conversation in the history.
Additional Context
No response
@sreesdas commented on GitHub (Jun 16, 2025):
While the development team works on implementing the feature, you can meanwhile use this title generation prompt as a workaround. Even a lighter model like LLaMA 3.2 1B running on a CPU via Ollama can do this relatively quickly.
Task:
Extract the first 4-6 words generated by the assistant
Guidelines:
Output:
JSON format: { "title": "your extracted title here ..." }
Examples:
Chat History:
<chat_history>
{{MESSAGES:END:2}}
</chat_history>
@tjbck commented on GitHub (Jun 16, 2025):
Addressed with ea578af45f