feat: More native autocomplete prompt template #5076

Closed
opened 2025-11-11 16:11:31 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @cutekibry on GitHub (May 7, 2025).

Check Existing Issues

  • I have searched the existing issues and discussions.

Problem Description

Currently, the backend sends the full chat history inside a single user-role message using a defined template. This may cause the LLM to perform worse than it would with a natural chat history (i.e., without the template).

It may also prevent some API endpoint services (such as LiteLLM + Azure OpenAI GPT-4o, which I am using) from using the prompt caching feature.


The core prompt template code is here:

a3bb7df610/backend/open_webui/config.py (L1499-L1539)

Desired Solution you'd like

The messages the backend currently sends to the API endpoint:

```json
[
  {
    "role": "system",
    "content": "[DEFAULT_PROMPT_IN_MODEL_SETTINGS]"
  },
  {
    "role": "user",
    "content": "### Task: \nYou are an autocompletion system...### Context:\n<chat_history>\n[CHAT_HISTORY]..."
  }
]
```

Expected: send the chat history as natural messages instead:

```json
[
  {
    "role": "system",
    "content": "[DEFAULT_PROMPT_IN_MODEL_SETTINGS]"
  },
  {
    "role": "user",
    "content": "### Task: \nYou are an autocompletion system..."
  },
  [CHAT_HISTORY]
]
```
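To make the contrast concrete, here is a minimal sketch (not Open WebUI's actual code; the function names and the history-rendering format are hypothetical) of the two ways the request body could be assembled. The "native" variant keeps each history message with its original role, so providers that cache on exact message prefixes can reuse earlier turns; the "templated" variant flattens the history into the single templated user turn, as the current implementation does.

```python
# Hypothetical sketch, not Open WebUI's implementation.
TASK_PROMPT = "### Task: \nYou are an autocompletion system..."


def build_messages_native(system_prompt, chat_history, task_prompt=TASK_PROMPT):
    """Proposed shape: chat history appended as native messages,
    each keeping its own "user"/"assistant" role."""
    return (
        [{"role": "system", "content": system_prompt}]
        + [{"role": "user", "content": task_prompt}]
        + list(chat_history)
    )


def build_messages_templated(system_prompt, chat_history, task_prompt=TASK_PROMPT):
    """Current shape: history rendered as text inside the templated
    user turn (rendering format here is illustrative only)."""
    rendered = "\n".join(
        f"{m['role'].upper()}: {m['content']}" for m in chat_history
    )
    return [
        {"role": "system", "content": system_prompt},
        {
            "role": "user",
            "content": (
                f"{task_prompt}### Context:\n"
                f"<chat_history>\n{rendered}\n</chat_history>"
            ),
        },
    ]
```

In the templated variant the user message changes on every turn, so an exact-prefix prompt cache misses; in the native variant earlier messages stay byte-identical across requests.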

Alternatives Considered

No response

Additional Context

No response

Reference: github-starred/open-webui#5076