Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-10 15:54:15 -05:00)
feat: More native autocomplete prompt template #5076
Originally created by @cutekibry on GitHub (May 7, 2025).
Problem Description
Currently, the backend sends the full chat history inside a single user-role message, rendered through a predefined template. This can make the LLM perform worse than it would with a natural chat history (separate messages, without a template).
It can also prevent some API endpoint services (e.g. LiteLLM + Azure OpenAI GPT-4o, which I am using) from using the prompt caching feature.
The core prompt-template code is here:
a3bb7df610/backend/open_webui/config.py (L1499-L1539)
Desired Solution you'd like
The chat history that the backend currently sends to the API endpoint:
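A minimal sketch of what this looks like: the template text and the helper below are hypothetical stand-ins, not the actual config.py implementation, but they illustrate how every turn gets collapsed into one templated user-role message.

```python
# Illustrative sketch only: TEMPLATE and render_templated_request are
# hypothetical names, not the real open-webui code.
TEMPLATE = (
    "### Task: Autocomplete\n"
    "### Chat History:\n"
    "{chat_history}\n"
)

def render_templated_request(history):
    """Flatten the whole history into ONE user-role message."""
    flat = "\n".join(f"{m['role'].upper()}: {m['content']}" for m in history)
    return [{"role": "user", "content": TEMPLATE.format(chat_history=flat)}]

history = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi! How can I help?"},
]
messages = render_templated_request(history)
# The endpoint sees a single user message; the assistant turn's role is lost,
# and the ever-changing template body defeats prefix-based prompt caching.
assert len(messages) == 1
assert messages[0]["role"] == "user"
```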
Expected natural chat history to be sent:
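A sketch of the desired behavior, under the same illustrative assumptions: pass the history through as-is, one API message per turn, so the endpoint sees native roles and can reuse cached prompt prefixes.

```python
# Illustrative sketch only: render_natural_request is a hypothetical name.
def render_natural_request(history):
    """Forward each turn as its own message with its original role."""
    return [{"role": m["role"], "content": m["content"]} for m in history]

history = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi! How can I help?"},
]
messages = render_natural_request(history)
# Roles are preserved turn by turn instead of being flattened into one blob.
assert [m["role"] for m in messages] == ["user", "assistant"]
```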
Alternatives Considered
No response
Additional Context
No response