[GH-ISSUE #2541] Can't change model_name in LiteLLM config (to have a user friendly display name) #12924

Closed
opened 2026-04-19 19:45:10 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @rsochard on GitHub (May 24, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2541

Bug Report

Description

Bug Summary:
When configuring a model using LiteLLM, it is not possible to set the 'model_name' parameter to a value different from 'litellm_params.model' in order to have a user-friendly display name.

Steps to Reproduce:
Go to "Settings", then "Models".
Click on "Show" for "Manage LiteLLM Models".
Click on "Show Additional Params".
In the fields below, set the correct parameters for an Azure OpenAI model, for example:

  • Add a model = azure/xxxx
  • Model Name = GPT4
  • API Base URL = https://xxxx.openai.azure.com
  • API Key = yyyyyyyyyyyy

After adding this configuration, an error occurs and a "Network connection" issue is displayed.

Expected Behavior:
"Model name" should only be used as a display name, not for configuring litellm_params.model, which is a separate parameter.
Using different values for "Model name" and litellm_params.model should not generate an error.
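For reference, LiteLLM's own proxy config already decouples the two fields: model_name is a user-facing alias, while litellm_params.model carries the provider-qualified identifier. A minimal config sketch (deployment name, URL, and key below are placeholders):

```yaml
model_list:
  - model_name: GPT4                      # friendly alias shown to users
    litellm_params:
      model: azure/my-deployment          # provider-qualified model string
      api_base: https://xxxx.openai.azure.com
      api_key: yyyyyyyyyyyy
```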

Environment

  • Open WebUI Version: 0.1.125

Additional Information

I think the error is located in 'backend/apps/litellm/main.py', inside the function 'add_model_to_config()', at the call to 'get_llm_provider()' (line 293). I suggest the correction below:

```
get_llm_provider(model=form_data.litellm_params['model'])
```
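To illustrate why the fix matters, here is a minimal, hypothetical sketch. The `infer_provider` helper below is a stand-in for litellm's `get_llm_provider`, not its real implementation, and `add_model_to_config` is a simplified stand-in for the Open WebUI function: validation should run against `litellm_params['model']` (which carries the provider prefix), never against the free-form display name.

```python
# Hypothetical sketch of the suggested fix. infer_provider() is a
# stand-in for litellm's get_llm_provider() and only handles the
# "provider/model" prefix case; real litellm does much more.

KNOWN_PROVIDERS = {"azure", "openai", "anthropic"}  # illustrative subset


def infer_provider(model: str) -> str:
    """Return the provider prefix of a litellm model string,
    e.g. 'azure/my-deployment' -> 'azure'."""
    prefix, _, rest = model.partition("/")
    if prefix in KNOWN_PROVIDERS and rest:
        return prefix
    raise ValueError(f"Could not infer provider for model: {model!r}")


def add_model_to_config(form_data: dict) -> dict:
    """Validate litellm_params['model'] (the suggested fix), leaving
    model_name as a purely cosmetic display label."""
    infer_provider(form_data["litellm_params"]["model"])  # raises if invalid
    return {
        "model_name": form_data["model_name"],          # display name, e.g. "GPT4"
        "litellm_params": form_data["litellm_params"],  # actual routing config
    }


entry = add_model_to_config({
    "model_name": "GPT4",  # display name differs from the underlying model
    "litellm_params": {
        "model": "azure/my-deployment",
        "api_base": "https://xxxx.openai.azure.com",
    },
})
print(entry["model_name"])  # GPT4
```

With the original code, validating the display name "GPT4" would raise here, reproducing the reported error; validating `litellm_params['model']` succeeds.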


Reference: github-starred/open-webui#12924