[GH-ISSUE #1227] feat: Model Whitelisting for LiteLLM #12403

Closed
opened 2026-04-19 19:19:21 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @flyfox666 on GitHub (Mar 20, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1227

Is your feature request related to a problem? Please describe.
I have both local models and LiteLLM models in use. When assigning models to registered users, I found that I can only use the Model Whitelisting function for local models; LiteLLM-connected models can still be selected by users. Is it possible to apply Model Whitelisting to LiteLLM models as well?

Describe the solution you'd like
The Model Whitelisting function should also apply to LiteLLM-connected models.

Describe alternatives you've considered
Setting up a password for LiteLLM-connected model usage.

![1](https://github.com/open-webui/open-webui/assets/121539277/e8b2965d-1705-4e18-8a6c-be77ef58f95a)
![2](https://github.com/open-webui/open-webui/assets/121539277/b3dccdd3-5162-4fcb-bf56-721678438861)

@justinh-rahb commented on GitHub (Mar 20, 2024):

In the Ollama and OpenAI endpoints, you have no control over what the /api/models or /v1/models endpoints return, so a whitelist was deemed useful to limit what's displayed to users. In LiteLLM, you control which models you add so it was assumed that admins would only add models they want their users to use. More granular access controls in general are still on the roadmap.
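The distinction above can be illustrated with a minimal sketch (not Open WebUI's actual implementation): since an upstream OpenAI-compatible `/v1/models` endpoint returns whatever models it hosts, the frontend can only restrict visibility by filtering that response against an admin-configured whitelist. The names `MODEL_WHITELIST` and `filter_models` here are hypothetical, for illustration only.

```python
# Illustrative sketch of whitelist filtering over a /v1/models-style
# payload. An empty whitelist is treated as "allow everything", mirroring
# typical whitelist semantics. All names here are hypothetical.

MODEL_WHITELIST = {"llama3:8b", "gpt-4o-mini"}  # admin-configured (assumed)

def filter_models(models_response: dict, whitelist: set) -> dict:
    """Keep only whitelisted entries from an OpenAI-style model list."""
    if not whitelist:  # no whitelist configured: pass everything through
        return models_response
    allowed = [m for m in models_response.get("data", [])
               if m.get("id") in whitelist]
    return {**models_response, "data": allowed}

# Example payload shaped like an upstream /v1/models response:
payload = {"object": "list",
           "data": [{"id": "llama3:8b"}, {"id": "internal-only-model"}]}
filtered = filter_models(payload, MODEL_WHITELIST)
# filtered["data"] now contains only the llama3:8b entry
```

For LiteLLM, the point in the comment above is that this filtering step was considered unnecessary, since the admin already chooses which models to register with the proxy.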


@flyfox666 commented on GitHub (Mar 20, 2024):

Oh, thanks, justinh-rahb.
I'd appreciate it if this function could be updated in the future.
The main reason for my issue is that sometimes I want LiteLLM available only for my wife and not for my friend.


@justinh-rahb commented on GitHub (Mar 20, 2024):

I agree it would be good to extend the feature to cover LiteLLM as well, just trying to shed light on the possible reasons it was overlooked or deferred.


@flyfox666 commented on GitHub (Mar 20, 2024):

Much appreciated.


Reference: github-starred/open-webui#12403