[GH-ISSUE #1724] Add option to have saved profiles #51280

Closed
opened 2026-05-05 12:13:23 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @ccrvlh on GitHub (Apr 24, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1724

Currently, there's no way to have multiple "profiles" of LLMs (e.g. saved system prompts / parameters / models), so even though you do have the ability to save prompts and quickly change models, you're always talking to the same "profile". There are "modelfiles", but from what I understood those are tied to Ollama and are not profiles that work with any model (e.g. API-based ones).

Overview

One of the main benefits of having "profiles" is being able to route questions: a system prompt for a screenwriter with a high temperature and high max tokens could be super useful for writing, but terrible for coding. When coding, you probably want a "you're a software development expert..." sort of prompt, with a mid temperature and maybe high max tokens, etc. For powerful models, you may want more straightforward answers to reduce the token count, and for cheaper/faster models, you may want something different.

Proposed Solution

Introduce a new concept of "Agents" or "Profiles" or "Personas" or anything like that. To reduce implementation cost/time, that could be a menu right above (or below) the "Prompts" menu, working in a very similar fashion, but on the "Create New" screen you would also be able to define the system prompt, the model, and the model parameters. The experience could also mirror prompts: typing `/some-profile` would load that profile.

Another option would be a more flexible "Modelfiles" implementation, allowing users to select not only Ollama-based models but also API-based ones.
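To make the idea concrete, here's a minimal sketch of what a saved profile record and the `/some-profile` lookup could look like. Everything here is hypothetical: the `Profile` class, `load_profile` helper, and field names do not exist in Open WebUI and only illustrate the shape of the feature.

```python
from dataclasses import dataclass

# Hypothetical shape of a saved "profile": a model, a system prompt,
# and the sampling parameters bundled under one slash-command handle.
@dataclass
class Profile:
    name: str            # slash-command handle, e.g. "coder"
    model: str           # any model id, Ollama- or API-based
    system_prompt: str
    temperature: float = 0.7
    max_tokens: int = 1024

def load_profile(profiles: dict[str, Profile], command: str) -> Profile:
    """Resolve a '/some-profile' style command to its saved profile."""
    return profiles[command.lstrip("/")]

profiles = {
    "screenwriter": Profile(
        name="screenwriter",
        model="gpt-4",
        system_prompt="You are an experienced screenwriter...",
        temperature=1.0,
        max_tokens=4096,
    ),
    "coder": Profile(
        name="coder",
        model="llama3",
        system_prompt="You are a software development expert...",
        temperature=0.3,
    ),
}

p = load_profile(profiles, "/coder")
print(p.model, p.temperature)  # llama3 0.3
```

The point is that switching profiles swaps the whole bundle (prompt + model + params) in one step, instead of changing each setting by hand.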

Describe alternatives you've considered
Currently there's really no way to do this other than manually changing the system prompt and the model params. Lobe Chat has an interesting implementation in which they have not only the chat history but also "Assistants", which are just saved system prompts/models/params.

Additional context
Would be happy to contribute to this.

Let me know if this makes sense, or if I'm misunderstanding the current Modelfiles implementation; maybe it's already possible to do this and I couldn't figure out how.

@tjbck commented on GitHub (Apr 24, 2024):

Great suggestions! Let's continue our discussion here: #665

@ccrvlh commented on GitHub (Apr 24, 2024):

Thanks! Missed that post, didn't know how to look for it tbh; that fits the bill perfectly.


Reference: github-starred/open-webui#51280