Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-16 20:11:53 -05:00)
enh: many model chat ui #997
Originally created by @chrisoutwright on GitHub (May 21, 2024).
Is your feature request related to a problem? Please describe.
I'm always frustrated when adding new models on top compresses the viewable chat canvas vertically, making it harder to read and interact with the chat. This reduces the usability and overall user experience, especially when multiple models are involved.
Describe the solution you'd like
I would like a toggle that either displays the set of models on a single line or places them sideways in a separate panel. This way the chat canvas height remains unaffected, ensuring a consistent and readable chat interface. Ideally, this toggle would be a button or a dropdown that lets users switch easily between compact and expanded views.
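A minimal sketch of what such a toggle could look like in TypeScript. The type name, function, and CSS class names here are hypothetical illustrations, not identifiers from the Open WebUI codebase:

```typescript
// Hypothetical sketch: a two-state toggle for the model list layout.
// "compact" keeps the models on one line; "expanded" moves them into
// a separate side panel so the chat canvas keeps its full height.
type ModelListView = "compact" | "expanded";

function toggleView(current: ModelListView): ModelListView {
  return current === "compact" ? "expanded" : "compact";
}

// Illustrative class names a component could bind to its container.
const viewClasses: Record<ModelListView, string> = {
  compact: "model-list model-list--inline",   // single line, scrolls horizontally
  expanded: "model-list model-list--sidebar", // separate panel beside the chat
};
```

A component would call `toggleView` from the button's click handler and apply the corresponding class, leaving the chat canvas height untouched in both states.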
Describe alternatives you've considered
Horizontal Scroll: Allow the models to be displayed in a single horizontal line with a scroll bar if the models exceed the screen width.
Collapsible Menu: Implement a collapsible menu for the models that users can expand or collapse as needed.
Separate Section: Place the models in a separate section on the side of the chat interface, allowing the chat canvas to maintain its full height.
@kwekewk commented on GitHub (May 26, 2024):
I suggest something similar to Gemini's drafts, along with its ability to regenerate responses.
@dexwenway commented on GitHub (Jun 6, 2024):
This feature is very helpful for interacting with multiple models in a single conversation, and I hope that it can be improved in the future.
@Maralai commented on GitHub (Jun 15, 2024):
I would also like to share some thoughts on the current implementation of running multiple models side by side in the interface. While I appreciate the effort behind this feature, I have a few suggestions and observations based on my experience.
User Interface Layout:
Prompt Targeting:
Model Interaction Enhancement:
Ultimately, the current multi-model feature seems geared more towards zero-shot evaluation than iterative prompting. It would be immensely helpful if there were a setting to revert to the earlier interface behavior, enabling more targeted and model-specific interactions.
Thank you for considering these suggestions. I’m hopeful they can enhance the usability and effectiveness of the multi-model feature.
@chrisoutwright commented on GitHub (Jun 23, 2024):
What I originally meant is that the vertical cutoff (before we got the side-by-side view, but still with the top cutoff) is not ideal:

Now, with side-by-side, I wish there were an option to specify the minimum width below which the output gets collapsed via an arrow. At the moment the window needs to be quite slim for that to happen:
This is really difficult to read this way (note how the third column ends up with nearly one token per line).
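The minimum-width idea above amounts to a simple layout check: collapse the side-by-side columns once splitting the canvas evenly would push each column below a user-configured readable width. A hedged sketch in TypeScript; `shouldCollapse` and its parameters are hypothetical names, not part of Open WebUI:

```typescript
// Hypothetical sketch: decide whether side-by-side model columns should
// collapse (e.g. behind an arrow) based on a user-set minimum column width.
interface LayoutOptions {
  containerWidth: number;  // px available to the chat canvas
  modelCount: number;      // models responding side by side
  minColumnWidth: number;  // user-configurable minimum readable width, px
}

function shouldCollapse({ containerWidth, modelCount, minColumnWidth }: LayoutOptions): boolean {
  if (modelCount <= 1) return false; // a single model always gets full width
  // Collapse when an even split would drop each column below the minimum.
  return containerWidth / modelCount < minColumnWidth;
}
```

For example, three models in a 1200 px canvas get 400 px each and stay side by side at a 320 px minimum, while four models in a 900 px canvas (225 px each) would collapse, avoiding the one-token-per-line columns described above.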