mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-06 10:58:17 -05:00
[GH-ISSUE #2259] bug: When running litellm in docker it won't connect because it doesn't run on localhost directly #12815
Originally created by @profikid on GitHub (May 14, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2259
When running litellm in docker it won't connect because my base url is not localhost
Setting LITELLM_PROXY_HOST doesn't work.
Culprit:
90503be2ed/backend/apps/litellm/main.py (L224)
90503be2ed/backend/apps/litellm/main.py (L332)

Possible solution:
f"{LITELLM_PROXY_HOST}:{LITELLM_PROXY_PORT}/v1"
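A minimal sketch of what that fix could look like, assuming the backend reads `LITELLM_PROXY_HOST` and `LITELLM_PROXY_PORT` from the environment (the `http://localhost` and `4000` defaults here are illustrative assumptions, not the project's actual defaults):

```python
import os

# Hypothetical sketch of the proposed fix: build the LiteLLM proxy base URL
# from environment variables instead of assuming the proxy runs on localhost.
# LITELLM_PROXY_HOST / LITELLM_PROXY_PORT are the variables named in the issue;
# the fallback values are assumptions for illustration only.
LITELLM_PROXY_HOST = os.environ.get("LITELLM_PROXY_HOST", "http://localhost")
LITELLM_PROXY_PORT = os.environ.get("LITELLM_PROXY_PORT", "4000")

# The f-string from the issue's suggested fix:
LITELLM_BASE_URL = f"{LITELLM_PROXY_HOST}:{LITELLM_PROXY_PORT}/v1"

print(LITELLM_BASE_URL)
```

With `LITELLM_PROXY_HOST=http://litellm` (the docker-compose service name) and `LITELLM_PROXY_PORT=4000`, this yields `http://litellm:4000/v1`, so a containerized LiteLLM instance would be reachable instead of only one bound to localhost.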
@justinh-rahb commented on GitHub (May 14, 2024):
This is for setting the host of the internal LiteLLM running inside WebUI, not for connecting to an external LiteLLM. Just add it as an "OpenAI" connection.