502 Server Error: Bad Gateway when downloading Ollama model via Open WebUI #769

Closed
opened 2025-11-11 14:30:53 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @GeorgesAlkhouri on GitHub (Apr 30, 2024).

Bug Report

Description

Bug Summary:
After following the standard installation instructions by copying the docker-compose.yaml file from the repository and executing docker compose up, I encounter a "502 Server Error: Bad Gateway" when trying to download an Ollama model through the Open WebUI interface.

Steps to Reproduce:

  1. Copy the docker-compose.yaml file from the official repository.
  2. Run docker compose up to start the services.
  3. Access the Open WebUI interface.
  4. Attempt to download an Ollama model.
  5. Encounter a 502 Bad Gateway error at URL http://ollama:11434/api/pull.

Expected Behavior:
The Ollama model should download successfully without server errors.

Actual Behavior:
A "502 Server Error: Bad Gateway" is displayed when attempting to download the model, indicating an issue with reaching the Ollama service.

Environment

  • Open WebUI Version: v0.1.122

  • Ollama (if applicable): v0.1.32

  • Operating System: Ubuntu 20.04

  • docker/docker compose: 26.1.0 / v2.26.1

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:

start.51f50eed.js:1 POST http://deshgslucy03p:3000/ollama/api/pull/0 500 (Internal Server Error)
window.fetch @ start.51f50eed.js:1
ot @ index.567f3fd4.js:14
$ @ 2.336e8072.js:33
ee @ 2.336e8072.js:40
2.336e8072.js:35 {detail: 'Ollama: 502 Server Error: Bad Gateway for url: http://ollama:11434/api/pull'}
2.336e8072.js:35 Ollama: 502 Server Error: Bad Gateway for url: http://ollama:11434/api/pull

Docker Container Logs:

open-webui  | INFO:apps.ollama.main:get_all_models()
ollama      | [GIN] 2024/04/30 - 07:53:12 | 200 |     151.677µs |      172.20.0.3 | GET      "/api/tags"
open-webui  | INFO:     10.15.153.127:54305 - "GET /ollama/urls HTTP/1.1" 200 OK
open-webui  | INFO:apps.ollama.main:get_all_models()
ollama      | [GIN] 2024/04/30 - 07:53:12 | 200 |     123.818µs |      172.20.0.3 | GET      "/api/tags"
ollama      | [GIN] 2024/04/30 - 07:53:12 | 200 |      40.812µs |      172.20.0.3 | GET      "/api/version"
open-webui  | INFO:     10.15.153.127:54305 - "GET /ollama/api/version HTTP/1.1" 200 OK
open-webui  | INFO:     10.15.153.127:54305 - "GET /litellm/api/model/info HTTP/1.1" 200 OK
open-webui  | INFO:apps.ollama.main:get_all_models()
ollama      | [GIN] 2024/04/30 - 07:53:15 | 200 |     144.804µs |      172.20.0.3 | GET      "/api/tags"
open-webui  | INFO:apps.ollama.main:url: http://ollama:11434
open-webui  | ERROR:apps.ollama.main:502 Server Error: Bad Gateway for url: http://ollama:11434/api/pull
open-webui  | Traceback (most recent call last):
open-webui  |   File "/app/backend/apps/ollama/main.py", line 311, in pull_model
open-webui  |     return await run_in_threadpool(get_request)
open-webui  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/starlette/concurrency.py", line 42, in run_in_threadpool
open-webui  |     return await anyio.to_thread.run_sync(func, *args)
open-webui  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
open-webui  |     return await get_async_backend().run_sync_in_worker_thread(
open-webui  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
open-webui  |     return await future
open-webui  |            ^^^^^^^^^^^^
open-webui  |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 851, in run
open-webui  |     result = context.run(func, *args)
open-webui  |              ^^^^^^^^^^^^^^^^^^^^^^^^
open-webui  |   File "/app/backend/apps/ollama/main.py", line 308, in get_request
open-webui  |     raise e
open-webui  |   File "/app/backend/apps/ollama/main.py", line 300, in get_request
open-webui  |     r.raise_for_status()
open-webui  |   File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
open-webui  |     raise HTTPError(http_error_msg, response=self)
open-webui  | requests.exceptions.HTTPError: 502 Server Error: Bad Gateway for url: http://ollama:11434/api/pull
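
Given that /api/tags succeeds immediately before the failing /api/pull, plain container-to-container networking seems unlikely to be the whole story. One common culprit worth ruling out (an assumption, not a confirmed diagnosis) is an HTTP_PROXY/HTTPS_PROXY variable inside the open-webui container: the traceback shows the backend using the requests library, which honors those variables, so the pull could be routed through a corporate proxy that cannot resolve the compose-internal hostname ollama and answers 502. A quick check:

# List any proxy variables visible to the backend process
docker compose exec open-webui env | grep -i proxy

# If proxy variables are set, exempting the internal hostname in the
# open-webui service environment keeps the call off the proxy:
#   - no_proxy=ollama,localhost,127.0.0.1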

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

Used Docker installation by executing docker compose up

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!


@GeorgesAlkhouri commented on GitHub (Apr 30, 2024):

So my current workaround is to connect to Ollama via network_mode: host:


services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    volumes:
      - ollama_data:/root/.ollama
    ports:
      - "11434:11434"   # publish Ollama on the host so open-webui can reach it at 127.0.0.1
    restart: unless-stopped

  open-webui:
    image: ghcr.io/open-webui/open-webui:${WEBUI_DOCKER_TAG-main}
    container_name: open-webui
    depends_on:
      - ollama
    volumes:
      - open-webui:/app/backend/data
    environment:
      # with host networking the service name "ollama" no longer resolves,
      # so point the backend at the host-published port instead
      - OLLAMA_BASE_URL=http://127.0.0.1:11434
    network_mode: host   # share the host network stack instead of the compose bridge
    restart: always

volumes:
  ollama_data: {}
  open-webui: {}
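
With this layout, open-webui shares the host's network stack and reaches Ollama through the port published on the host, bypassing compose's internal DNS (and any proxy rules keyed on the ollama hostname). Note that with host networking the WebUI is no longer published on a mapped port; it should listen directly on the host, port 8080 by default for this image. A sanity check after docker compose up, run on the host:

# Ollama should answer on the host-published port
curl -s http://127.0.0.1:11434/api/version

# Pulling directly from Ollama should stream progress JSON
# ("llama3" here is just an example model name)
curl -s http://127.0.0.1:11434/api/pull -d '{"name": "llama3"}'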
Reference: github-starred/open-webui#769