bug: When running litellm in docker it won't connect because it doesn't run on localhost directly #919

Closed
opened 2025-11-11 14:33:43 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @profikid on GitHub (May 14, 2024).

When running LiteLLM in Docker, it won't connect because my base URL is not localhost.

Setting LITELLM_PROXY_HOST doesn't work.

Culprit:

https://github.com/open-webui/open-webui/blame/90503be2edef1a1f7ce2074286b6316d5cb8868a/backend/apps/litellm/main.py#L224

https://github.com/open-webui/open-webui/blame/90503be2edef1a1f7ce2074286b6316d5cb8868a/backend/apps/litellm/main.py#L332

Possible solution:

f"{LITELLM_PROXY_HOST}:{LITELLM_PROXY_PORT}/v1"
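A minimal sketch of the proposed fix: build the base URL from environment variables rather than hardcoding localhost. The variable names come from the issue report; the defaults below are illustrative assumptions, not the actual values used by Open WebUI.

```python
import os

def litellm_base_url() -> str:
    """Build the LiteLLM proxy base URL from environment variables.

    Illustrative sketch of the fix proposed in this issue; the env var
    names are taken from the report, the fallback defaults are assumed.
    """
    host = os.getenv("LITELLM_PROXY_HOST", "http://localhost")
    port = os.getenv("LITELLM_PROXY_PORT", "4000")
    return f"{host}:{port}/v1"
```

With this, a container could set `LITELLM_PROXY_HOST=http://litellm` and have the backend reach a non-localhost proxy.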

Author
Owner

@justinh-rahb commented on GitHub (May 14, 2024):

This is for setting the host of the internal LiteLLM running inside WebUI, not for connecting to an external LiteLLM. Just add it as an "OpenAI" connection.
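Following that suggestion, an external LiteLLM proxy would be wired up as an OpenAI-compatible connection rather than via LITELLM_PROXY_HOST. A hedged sketch, assuming a LiteLLM container reachable at hostname `litellm` on port 4000 (both assumptions, not from the issue):

```shell
# Sketch: point Open WebUI at an external LiteLLM proxy by treating it
# as an OpenAI-compatible endpoint. Hostname "litellm", port 4000, and
# the placeholder API key are illustrative assumptions.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL="http://litellm:4000/v1" \
  -e OPENAI_API_KEY="sk-anything" \
  ghcr.io/open-webui/open-webui:main
```

The same connection can also be added from the WebUI settings instead of environment variables.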


Reference: github-starred/open-webui#919