[GH-ISSUE #945] Unable to get model list from litellm proxy
Originally created by @shekhars-li on GitHub (Feb 27, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/945
Bug Report
Description
Bug Summary:
I have set up a LiteLLM proxy. Both containers (open-webui and the LiteLLM proxy) are running properly. I can access my litellm-proxy endpoint and query the model list from the open-webui container console (Python interpreter -> get the model list by passing the master_key). However, on loading open-webui, the model list only shows Ollama models and not the LiteLLM models I configured. I get this error:
Looks like the GET request succeeds but the JSON is not loaded properly.
I tried to get the JSON list myself inside the open-webui container and I see the request is successful. Can you please help me fix this?
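For reference, a minimal version of that check from the container's Python interpreter might look like this (the proxy URL and master key below are placeholders, not the actual values from this setup):

```python
# Sketch: list models from the LiteLLM proxy's OpenAI-compatible endpoint.
# The base URL and master key are placeholders standing in for the real ones.
import json
import urllib.request

base_url = "http://openai-proxy:8000/v1"
master_key = "sk-your-litellm-master-key"

req = urllib.request.Request(
    f"{base_url}/models",
    headers={"Authorization": f"Bearer {master_key}"},
)
with urllib.request.urlopen(req) as resp:
    models = json.load(resp)

# OpenAI-compatible responses wrap the models in a "data" list.
print([m["id"] for m in models.get("data", [])])
```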
Steps to Reproduce:
Here's my docker-compose.yml:
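(The compose file itself is not reproduced in this mirror; the following is only an illustrative sketch of the setup described. The openai-proxy service name and port come from the URL quoted in the fix below; the image tags, host port mappings, and key value are assumptions.)

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OPENAI_API_BASE_URL=http://openai-proxy:8000/v1   # value that later turned out to be the problem
      - OPENAI_API_KEY=sk-your-litellm-master-key          # placeholder for the proxy master key

  openai-proxy:
    image: ghcr.io/berriai/litellm:main-latest
    volumes:
      - ./litellm-config.yaml:/app/config.yaml
    command: ["--config", "/app/config.yaml", "--port", "8000"]
    ports:
      - "8000:8000"
```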
Here's my litellm config:
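(Again a sketch rather than the original file: a minimal LiteLLM proxy config of this shape, assuming Anthropic models as mentioned under Expected Behavior; the model names and key references are placeholders.)

```yaml
model_list:
  - model_name: claude-3-sonnet              # placeholder model alias
    litellm_params:
      model: anthropic/claude-3-sonnet-20240229
      api_key: os.environ/ANTHROPIC_API_KEY

general_settings:
  master_key: sk-your-litellm-master-key     # the master_key used when querying the model list
```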
Expected Behavior:
Model list shows up via the LiteLLM proxy and includes the Anthropic models I configured.
Actual Behavior:
Error when trying to list all the models
Environment
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
None
Docker Container Logs:
Screenshots (if applicable):

Installation Method
Docker installation (docker-compose file shared above).
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
@justinh-rahb commented on GitHub (Feb 27, 2024):
Try changing OPENAI_API_BASE_URL=http://openai-proxy:8000/v1 to OPENAI_API_BASE_URL=http://host.docker.internal:4000/v1. I've had some issues where the LiteLLM proxy doesn't like to listen from the internal container network.
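In compose terms, the suggested change amounts to the base-URL environment entry on the open-webui service, roughly as below; the extra_hosts entry is my addition and is only needed on Linux hosts so that host.docker.internal resolves:

```yaml
services:
  open-webui:
    environment:
      # before: - OPENAI_API_BASE_URL=http://openai-proxy:8000/v1
      - OPENAI_API_BASE_URL=http://host.docker.internal:4000/v1
    extra_hosts:
      - host.docker.internal:host-gateway   # lets Linux containers reach the Docker host
```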
@shekhars-li commented on GitHub (Feb 27, 2024):
Wow that was quick! Thank you so much @justinh-rahb! Really appreciate it. This fixed the issue, although I am not sure why; it seems like LiteLLM could get and send responses fine with the openai-proxy URL?
@justinh-rahb commented on GitHub (Feb 27, 2024):
Your guess is as good as mine; I didn't feel like debugging it, so I just went with what worked at the time 😬 I've since removed the LiteLLM container from my WebUI stack because it's now built into the project and can be configured from the Settings > Models interface.
@shekhars-li commented on GitHub (Feb 27, 2024):
Thanks a lot for your help! :)