[GH-ISSUE #2832] Feat: Set default loaded model if OpenAI endpoint lists a loaded model #13031

Closed
opened 2026-04-19 19:51:13 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @bartowski1182 on GitHub (Jun 5, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2832

I've been using tabbyAPI, and in particular I've added to its completion endpoint the ability to load the requested model before answering.

This makes the default new-chat option a bit more difficult, since I would prefer it to auto-select the already loaded model, which I can then change if needed.

So it would be nice if, when opening a new chat, the code checked whether the OpenAI endpoint offers a /model route; if it does and a model is returned, that model should be set as the current one.


Reference: github-starred/open-webui#13031