[GH-ISSUE #16433] issue: Can't connect litellm-proxy to open-webui container #56568

Closed
opened 2026-05-05 19:43:11 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @lackos on GitHub (Aug 10, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/16433

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Git Clone

Open WebUI Version

v0.5.20

Ollama Version (if applicable)

No response

Operating System

Manjaro Linux 25.0.6

Browser (if applicable)

Firefox 141.0

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

A connection is established between the litellm-proxy container and the open-webui container.

  1. Navigate to Settings -> Connections -> Add Connection.
  2. Enter the internal URL for litellm-proxy: `http://litellm:4000`
  3. Enter the API master key for litellm-proxy: `sk-12345`
  4. Save the connection.
  5. LiteLLM models are available in Open WebUI.

Actual Behavior

  1. Navigate to Settings -> Connections -> Add Connection.
  2. Enter the internal URL for litellm-proxy: `http://litellm:4000`
  3. Enter the API master key for litellm-proxy: `sk-12345`
  4. Click 'Verify Connection' and get 'OpenAI: Network Problem'.
  5. After saving, no models from the connection are listed.

Steps to Reproduce

  1. Install Docker version 28.3.2.
  2. Start the Docker service.
  3. In a directory, create the following files: `docker-compose.yaml`, `.env`, `config.yaml`.

docker-compose.yaml

version: '3.8'

services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "8080:8080"
    environment:
      - GLOBAL_LOG_LEVEL=DEBUG
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - open-webui-data:/app/backend/data
    restart: unless-stopped

  
  litellm:
    image: ghcr.io/berriai/litellm:main-stable
    container_name: litellm-proxy
    environment:
      - GEMINI_API_KEY=${GEMINI_API_KEY}
      - LITELLM_MASTER_KEY=${LITELLM_MASTER_KEY}
    ports:
      - "4000:4000"
    volumes:
      - ./config.yaml:/app/config.yaml:z
    command: 
      - "--config"
      - "/app/config.yaml"
      - "--detailed_debug"
    restart: unless-stopped

volumes:
  open-webui-data:

.env

#Your Google AI API key; a working value is unimportant for establishing a connection between open-webui and litellm
GEMINI_API_KEY=abcdef

#Your master key for authorizing requests to the LiteLLM proxy
LITELLM_MASTER_KEY=sk-12345
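Compose substitutes the `${GEMINI_API_KEY}` and `${LITELLM_MASTER_KEY}` references in the `environment:` section above from this `.env` file. A minimal Python sketch of that substitution step (the parser below is illustrative only, not Compose's actual implementation):

```python
import re

# Contents of the .env file shown above
env_text = """
#Your Google AI API key
GEMINI_API_KEY=abcdef

#Your master key for authorizing requests to the LiteLLM proxy
LITELLM_MASTER_KEY=sk-12345
"""

# Parse KEY=VALUE lines, skipping blanks and # comments
env = {}
for line in env_text.splitlines():
    line = line.strip()
    if not line or line.startswith("#"):
        continue
    key, _, value = line.partition("=")
    env[key] = value

# Expand ${VAR} references the way the compose file uses them
def substitute(s: str) -> str:
    return re.sub(r"\$\{(\w+)\}", lambda m: env.get(m.group(1), ""), s)

print(substitute("LITELLM_MASTER_KEY=${LITELLM_MASTER_KEY}"))
# -> LITELLM_MASTER_KEY=sk-12345
```

This is why the litellm container ends up accepting `sk-12345` as the master key: the value flows from `.env` through Compose interpolation into the container's environment.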

config.yaml

model_list:
  - model_name: "gemini-2.5-pro" 
    litellm_params:
      model: "gemini/gemini-2.5-pro" 
  4. Start the containers: `docker-compose up`
  5. Wait for the images to pull and the services to start.
  6. Test that the open-webui container can connect to the litellm container: `docker-compose exec open-webui curl http://litellm:4000/models -H "Authorization: Bearer sk-12345"`
  7. Navigate to Settings -> Connections -> Add Connection.
  8. Enter the internal URL for litellm-proxy: `http://litellm:4000`
  9. Enter the API master key for litellm-proxy: `sk-12345`
  10. Click 'Verify Connection' and get 'OpenAI: Network Problem'.
  11. After saving, no models from the connection are listed.

Logs & Screenshots

open-webui container logs

2025-08-10 09:29:44.954 | DEBUG | aiocache.base:set:280 - SET open_webui.routers.ollamaget_all_models(<starlette.requests.Request object at 0x7f351c39fc90>,)[('user', UserModel(id='a8c88fff-23cd-44f7-a49c-a35193de762e', name='User', email='user@gmail.com', role='admin', profile_image_url=, last_active_at=1754818179, updated_at=1754568415, created_at=1754568415, api_key=None, settings=UserSettings(ui={'version': '0.5.20', 'directConnections': {'OPENAI_API_BASE_URLS': ['http://litellm:4000'], 'OPENAI_API_KEYS': ['sk-12345'], 'OPENAI_API_CONFIGS': {'0': {'enable': True, 'prefix_id': 'lite', 'model_ids': []}}}, 'models': ['deepseek-r1:32b']}), info=None, oauth_sub=None))] 1 (0.0000)s - {}
2025-08-10 09:29:44.956 | DEBUG | open_webui.utils.models:get_all_models:220 - get_all_models() returned 5 models - {}
2025-08-10 09:29:44.957 | DEBUG | open_webui.main:get_models:971 - /api/models returned filtered models accessible to the user: ["deepseek-coder-v2:16b", "deepseek-r1:32b", "deepseek-r1:14b", "deepseek-r1:latest", "arena-model"] - {}
2025-08-10 09:29:44.958 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.22.0.1:53480 - "GET /api/models HTTP/1.1" 200 - {}
2025-08-10 09:29:44.990 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.22.0.1:53480 - "POST /api/v1/users/user/settings/update HTTP/1.1" 200 - {}
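The aiocache DEBUG line shows where the connection was actually saved: in the user's `directConnections` UI settings rather than the server-wide connection list. A small sketch of reading the relevant fields out of that structure (the dict below is transcribed from the log above):

```python
# Subset of the UserSettings.ui payload from the DEBUG log above
ui = {
    "version": "0.5.20",
    "directConnections": {
        "OPENAI_API_BASE_URLS": ["http://litellm:4000"],
        "OPENAI_API_KEYS": ["sk-12345"],
        "OPENAI_API_CONFIGS": {"0": {"enable": True, "prefix_id": "lite", "model_ids": []}},
    },
    "models": ["deepseek-r1:32b"],
}

# Pair each saved base URL with its API key
dc = ui["directConnections"]
pairs = list(zip(dc["OPENAI_API_BASE_URLS"], dc["OPENAI_API_KEYS"]))
print(pairs)
# -> [('http://litellm:4000', 'sk-12345')]
```

Note that if these are browser-side "Direct Connections" (as the settings key suggests), the request is issued from the browser rather than the open-webui backend, in which case the Docker-internal hostname `litellm` would not resolve from the client even though the in-container curl succeeds; this is a possible explanation, not confirmed in the issue.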

Verify connection between containers

$ docker-compose exec open-webui curl http://litellm:4000/models -H "Authorization: Bearer sk-12345"
> {"data":[{"id":"gemini-2.5-pro","object":"model","created":1677610602,"owned_by":"openai"}],"object":"list"}
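The `/models` response above is standard OpenAI-format JSON, so the same check can be done programmatically instead of eyeballing the curl output (response body copied verbatim from above):

```python
import json

# Response body returned by http://litellm:4000/models above
body = '{"data":[{"id":"gemini-2.5-pro","object":"model","created":1677610602,"owned_by":"openai"}],"object":"list"}'

models = json.loads(body)
ids = [m["id"] for m in models["data"]]
print(ids)
# -> ['gemini-2.5-pro']
```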

Screenshot of error when verifying connection

Image: https://github.com/user-attachments/assets/b27f4c40-fd9e-436c-991c-57a761b0572e

Additional Information

No response

GiteaMirror added the bug label 2026-05-05 19:43:11 -05:00