mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-10 15:54:15 -05:00)
bug: When running litellm in docker it won't connect because it doesn't run on localhost directly #919
Originally created by @profikid on GitHub (May 14, 2024).
When running LiteLLM in Docker, it won't connect because my base URL is not localhost.
Setting LITELLM_PROXY_HOST doesn't work.
Culprit:
90503be2ed/backend/apps/litellm/main.py (L224)
90503be2ed/backend/apps/litellm/main.py (L332)
Possible solution:
f"{LITELLM_PROXY_HOST}:{LITELLM_PROXY_PORT}/v1"
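A minimal sketch of the proposed change: read both environment variables and build the base URL from them instead of hardcoding localhost, so a Dockerized LiteLLM reachable under another hostname can be used. The default values below are illustrative assumptions, not the actual defaults in the Open WebUI codebase.

```python
import os

# Assumed environment variables from the issue; the fallback values here
# are illustrative only.
LITELLM_PROXY_HOST = os.environ.get("LITELLM_PROXY_HOST", "http://localhost")
LITELLM_PROXY_PORT = os.environ.get("LITELLM_PROXY_PORT", "4000")

# Build the base URL from both variables, per the proposed fix, rather
# than assuming the proxy is on localhost.
base_url = f"{LITELLM_PROXY_HOST}:{LITELLM_PROXY_PORT}/v1"
print(base_url)
```

With LITELLM_PROXY_HOST set to the Docker service name (e.g. `http://litellm`), the backend would then target the containerized proxy instead of localhost.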
@justinh-rahb commented on GitHub (May 14, 2024):
This is for setting the host of the internal LiteLLM running inside WebUI, not for connecting to an external LiteLLM. Just add it as an "OpenAI" connection.