[GH-ISSUE #5962] enh: prompt management w/ version control #52852

Closed
opened 2026-05-05 14:01:45 -05:00 by GiteaMirror · 10 comments
Owner

Originally created by @flefevre on GitHub (Oct 7, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/5962

Originally assigned to: @tjbck on GitHub.

Feature Request: Prompt Management System for Open WebUI

Is your feature request related to a problem? Please describe.
Managing prompts in Open WebUI can be cumbersome without a proper system. Hardcoded prompts lead to versioning challenges, complicate updates, and reduce flexibility for both technical and non-technical users. Currently, there is no efficient way to store, track, or manage prompts dynamically, limiting the ability to optimize or customize the system.

Describe the solution you'd like
I propose the implementation of a prompt management system within Open WebUI. This system would allow users to:

  • Store and track prompts centrally, enabling easier version control.
  • Update prompts dynamically without needing to redeploy the application.
  • Provide metrics on prompt performance and help in optimizing them.

Additionally, this system should be capable of integrating with existing prompt management solutions to avoid reinventing the wheel. For example, by allowing connection to a third-party system, users can seamlessly leverage external prompt management tools without extensive modifications to Open WebUI.
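To make the "update without redeploy" point concrete, here is a minimal sketch of resolving a prompt by name at request time from a store, instead of hardcoding it. All names (`PromptStore`, `build_request`) are illustrative, not existing Open WebUI APIs; a real store would be database-backed.

```python
# Hypothetical sketch: resolving prompts at request time instead of
# hardcoding them, so edits take effect without a redeploy.
# PromptStore and build_request are illustrative names only.

class PromptStore:
    """In-memory stand-in for a database-backed prompt table."""

    def __init__(self):
        self._prompts = {}

    def set_prompt(self, name: str, text: str) -> None:
        self._prompts[name] = text

    def get_prompt(self, name: str) -> str:
        return self._prompts[name]


store = PromptStore()
store.set_prompt("summarize", "Summarize the following text:\n{input}")

# A later edit takes effect immediately for the next request:
store.set_prompt("summarize", "Summarize the text below in 3 bullets:\n{input}")


def build_request(prompt_name: str, user_input: str) -> str:
    # Looked up per request, so the latest stored version is always used.
    return store.get_prompt(prompt_name).format(input=user_input)
```

Because the lookup happens per request, an admin edit in the UI (or via an external tool) changes behavior immediately, which is the core of the feature being requested.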

Describe alternatives you've considered
Manually updating and managing prompts within the codebase is the current solution, which requires redeployments for changes and lacks efficient version control. This method is not scalable and adds unnecessary complexity.

Additional context
Integration with an external prompt management system (such as an API) would allow teams to use industry-standard tools for managing prompts, reducing development overhead while increasing flexibility and control.


@Capsar commented on GitHub (Dec 3, 2024):

+1 I am looking forward to this feature, it would be very helpful!


@superjamie commented on GitHub (Dec 26, 2024):

This would be a cool feature.

The Msty chat frontend (https://msty.app/) has similar functionality. It provides:

  • a pre-populated Prompt Library
  • the ability to bookmark favourite prompts
  • the ability to store your own prompts
  • a quick UI button to apply a library prompt to a new chat
  • bookmarked prompts appearing at the top of that list

@flefevre commented on GitHub (Dec 26, 2024):

Ok, thanks for this link.
My initial idea was a 'prompt management system'.
Key elements were:

  • RBAC over prompt creation, management, and definition, with team roles
  • control of prompts across several versions
  • one version dedicated to the prompt in production
  • a rollback button in case of divergence

Linking to Langfuse could also be a good idea.
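The version/production/rollback semantics described above can be sketched as an append-only history with a "production" pointer. This is a hypothetical illustration, not Open WebUI code; a real implementation would persist the history and enforce the RBAC rules per team role.

```python
# Minimal sketch of versioned prompts with a production pointer and a
# rollback operation. All names are illustrative; persistence and RBAC
# checks are omitted for brevity.

class VersionedPrompt:
    def __init__(self, name: str):
        self.name = name
        self.versions = []      # append-only history of prompt texts
        self.production = None  # index of the version served in production

    def add_version(self, text: str) -> int:
        """Append a new version and return its index."""
        self.versions.append(text)
        return len(self.versions) - 1

    def promote(self, version: int) -> None:
        """Mark one version as the one served in production."""
        if not 0 <= version < len(self.versions):
            raise IndexError("unknown version")
        self.production = version

    def rollback(self) -> None:
        """The 'rollback button': step production back one version."""
        if self.production is None or self.production == 0:
            raise RuntimeError("nothing to roll back to")
        self.production -= 1

    def live_text(self) -> str:
        if self.production is None:
            raise RuntimeError("no version promoted")
        return self.versions[self.production]
```

With this shape, "divergence" recovery is a single `rollback()` call that never loses history, since versions are only ever appended.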


@superjamie commented on GitHub (Dec 26, 2024):

If you wanted to pre-populate prompts, I found a set of CC0 prompts here:

https://github.com/f/awesome-chatgpt-prompts

This appears to be mostly what Msty uses as well.


@silentoplayz commented on GitHub (Aug 25, 2025):

Related - https://github.com/open-webui/open-webui/issues/3745


@tjbck commented on GitHub (Oct 2, 2025):

Soon.


@silentoplayz commented on GitHub (Oct 3, 2025):

Linking a closed PR of mine - https://github.com/open-webui/open-webui/pull/13991


@flefevre commented on GitHub (Oct 25, 2025):

I’d like to share some thoughts on the potential value of deeper Langfuse compatibility in Open WebUI — while fully acknowledging that this discussion also has a strategic dimension for the project’s roadmap.
I’ll align with whatever direction the Open WebUI team decides to take, but I think it’s worth considering the broader ecosystem benefits.

💡 Why Langfuse compatibility could be valuable

  1. Prompt lifecycle and management
    Langfuse offers mature tools for managing prompts — including versioning, rollback, history tracking, and branching.
    These features help teams handle prompts as evolving assets rather than static strings, improving maintainability and experimentation.

  2. Traceability and compliance
    In some contexts (e.g. regulated industries), having full traceability of prompts — knowing which version was used, when and by whom — can be critical from a compliance or audit perspective.
    Langfuse already provides this kind of prompt lineage functionality.

  3. Ecosystem synergy
    Strengthening the connection between Open WebUI and Langfuse wouldn’t just benefit one project — it could help reinforce the open-source ecosystem as a whole.
    For example, Langfuse is already integrated with tools like LiteLLM for prompt management (https://docs.litellm.ai/docs/proxy/prompt_management), showing that interoperability can unlock value across multiple components.

  4. Strategic coexistence
    I completely understand that Open WebUI needs to define its own identity and priorities.
    My view is simply that having one strong open-source UI and a healthy, interconnected ecosystem around it can make the community as a whole more resilient and innovative.

🔧 Possible integration ideas (if aligned with the project’s vision)

  • Optional backend or plugin to fetch/sync prompt templates from Langfuse
  • Logging of prompt versions and usage metadata back to Langfuse for traceability
  • UI selector to choose or roll back prompt versions directly from Langfuse
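On the "fetch/sync prompt templates" idea: Langfuse prompt templates use `{{variable}}` placeholders, and its Python SDK exposes a fetch-and-compile flow (roughly `Langfuse().get_prompt(name)` followed by `prompt.compile(...)` — not verified here). A sketch of just the local compile step, with no Langfuse dependency, could look like this; the double-brace syntax is the only assumption taken from Langfuse's template format.

```python
import re

# Sketch of the local half of a "fetch/sync from Langfuse" plugin: given
# a template string with {{variable}} placeholders, fill in values at
# chat time. The fetch itself would come from the Langfuse SDK or REST
# API and is deliberately not shown.

def compile_template(template: str, variables: dict) -> str:
    """Replace {{name}} placeholders; leave unknown ones untouched."""
    def sub(match):
        key = match.group(1).strip()
        # Unknown variables are left as-is so missing values are visible.
        return variables.get(key, match.group(0))

    return re.sub(r"\{\{([^{}]+)\}\}", sub, template)
```

Keeping unknown placeholders intact (rather than erroring) makes a partially synced prompt easy to spot in the UI during rollout.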

In short: I see this not as a competing direction, but as a potentially complementary integration that could strengthen both Open WebUI and the broader open-source LLM tooling ecosystem.

Happy to align with the team’s strategy and contribute ideas if this direction becomes relevant.


@druellan commented on GitHub (Oct 25, 2025):

I'm currently learning prompting, and I'm already on v5 of the small prompt I'm building for a project, so I'm feeling the need for proper versioning and testing. Having versioning in OWUI would be very helpful, especially for agents, where micro-adjusting a prompt is common, but I agree that proper integration with tools like Langfuse unlocks another level, with a proper lifecycle, testing, and even orchestration.
OWUI also lacks a universal system prompt, so a single repository would also help keep several models up to date.


@flefevre commented on GitHub (Nov 18, 2025):

I have continued to explore this key topic.

Perhaps we should look at how LiteLLM is moving by integrating prompts into GitOps.
They have created a binding to a git repo with the file extension .prompt:

  • https://docs.litellm.ai/docs/proxy/native_litellm_prompt
  • https://docs.litellm.ai/docs/proxy/prompt_management

I think it will be key to have a standard between LiteLLM (https://github.com/BerriAI/litellm/discussions/16056), Langfuse (https://github.com/orgs/langfuse/discussions/6561), and Open WebUI.
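A `.prompt` file in the style those LiteLLM docs describe is frontmatter between `---` markers followed by the template body (the exact schema is an assumption here). A minimal parser, with no YAML dependency and only simple `key: value` frontmatter lines, could look like this:

```python
# Hedged sketch of reading a dotprompt-style ".prompt" file: frontmatter
# between "---" markers, then the template body. The exact schema is an
# assumption; this only splits the two parts and reads flat "key: value"
# frontmatter lines.

def parse_prompt_file(text: str):
    meta = {}
    body = text
    if text.startswith("---"):
        # split("---", 2) -> ["", frontmatter, body]
        _, frontmatter, body = text.split("---", 2)
        for line in frontmatter.strip().splitlines():
            if ":" in line:
                key, _, value = line.partition(":")
                meta[key.strip()] = value.strip()
    return meta, body.lstrip("\n")
```

Because the files are plain text in a git repo, versioning, review, and rollback come for free from GitOps, which is presumably the appeal of this direction.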

Reference: github-starred/open-webui#52852