[GH-ISSUE #5294] OpenWebUI Hangs for Several Minutes Without Error After Ollama URL Becomes Unreachable #13931

Closed
opened 2026-04-19 20:27:57 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @ForestRealms on GitHub (Sep 9, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/5294

Bug Report

Installation Method

Docker

Environment

  • Open WebUI Version: v0.3.21

  • Operating System: OpenCloudOS 8

  • Browser (if applicable): Chrome 128.0.6613.120

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

When the Ollama URL in the admin settings panel under External Connections is unreachable or invalid (the same phenomenon occurs when the OpenAI API URL is unreachable for some reason), OpenWebUI should provide feedback or an error message quickly, or display the UI first, rather than testing the URL for reachability before displaying the UI.
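The behavior asked for above can be sketched as a non-blocking reachability probe: check each configured endpoint off the page-load path with a short timeout, so the settings UI renders immediately and the result arrives afterwards. This is a minimal illustration, not Open WebUI's actual code; `check_reachable` and the 2-second budget are assumptions made for the example.

```python
# Hedged sketch: probe endpoint reachability concurrently with a short
# timeout instead of blocking the settings page on a dead connection.
import concurrent.futures
import socket
from urllib.parse import urlparse


def check_reachable(url: str, timeout: float = 2.0) -> bool:
    """Cheap TCP-level probe: can we open a socket to the host at all?"""
    parsed = urlparse(url)
    port = parsed.port or (443 if parsed.scheme == "https" else 80)
    try:
        with socket.create_connection((parsed.hostname, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, host/network unreachable, and timeout.
        return False


def probe_all(urls: list[str], timeout: float = 2.0) -> dict[str, bool]:
    """Run all probes in parallel so total wait is one timeout, not N."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = {u: pool.submit(check_reachable, u, timeout) for u in urls}
        return {u: f.result() for u, f in futures.items()}
```

The UI could then render immediately and mark each connection as up or down once its probe resolves, rather than waiting minutes for an OS-level connect timeout.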

Actual Behavior:

When an Ollama URL or OpenAI API URL is set that is unavailable or invalid, OpenWebUI takes an excessively long time (several minutes) to load the External Connections section under the admin settings panel. During this time, the UI is unresponsive, and no immediate error or feedback is shown in the UI to indicate that the connection has failed.

Description

Bug Summary:
When configuring an unavailable or incorrect Ollama URL, OpenWebUI takes an excessive amount of time to respond in the admin panel's External Connections section, resulting in loading times of several minutes without proper error feedback.

Reproduction Details

Steps to Reproduce:

  1. Navigate to the OpenWebUI admin settings panel.
  2. Go to the External Connections section.
  3. Set an Ollama URL that is valid and available.
  4. Shut down the server hosting the Ollama service, making the URL invalid.
  5. Re-login to (or restart) OpenWebUI and navigate back to the External Connections section.
  6. Observe that OpenWebUI takes several minutes to load or respond.
  7. No error message is displayed immediately; the page stays in a loading state for a long time before it finally loads.

Note: A similar phenomenon occurs when the OpenAI API is not reachable, such as when the server cannot connect to api.openai.com

Logs and Screenshots

Browser Console Logs:

Failed to load resource: the server responded with a status of 500 (Internal Server Error)
Connections.svelte:156 Uncaught (in promise) OpenAI: Network Problem
+layout.svelte:84 user-count Object
+layout.svelte:89 usage Objectmodels: [][[Prototype]]: Object
+layout.svelte:65 connected GKttIkReJLv7qiinAAAI
version:1 
        
        
       Failed to load resource: the server responded with a status of 500 (Internal Server Error)
index.ts:153 {detail: 'WebUI could not connect to Ollama'}detail: "WebUI could not connect to Ollama"[[Prototype]]: Object
+layout.svelte:84 user-count {count: 1}

Docker Container Logs:

ERROR [open_webui.apps.openai.main] HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/models (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f34a5f2ed10>: Failed to establish a new connection: [Errno 101] Network is unreachable'))
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 196, in _new_conn
    sock = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/util/connection.py", line 85, in create_connection
    raise err
  File "/usr/local/lib/python3.11/site-packages/urllib3/util/connection.py", line 73, in create_connection
    sock.connect(sa)
OSError: [Errno 101] Network is unreachable

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 789, in urlopen
    response = self._make_request(
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 490, in _make_request
    raise new_e
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 466, in _make_request
    self._validate_conn(conn)
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 1095, in _validate_conn
    conn.connect()
  File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 615, in connect
    self.sock = sock = self._new_conn()
                       ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 211, in _new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPSConnection object at 0x7f34a5f2ed10>: Failed to establish a new connection: [Errno 101] Network is unreachable

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 667, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 843, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/util/retry.py", line 519, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/models (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f34a5f2ed10>: Failed to establish a new connection: [Errno 101] Network is unreachable'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/backend/open_webui/apps/openai/main.py", line 322, in get_models
    r = requests.request(method="GET", url=f"{url}/models", headers=headers)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 700, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/models (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f34a5f2ed10>: Failed to establish a new connection: [Errno 101] Network is unreachable'))
ERROR [open_webui.apps.ollama.main] Connection error: 
ERROR [open_webui.apps.ollama.main] Connection error: 
ERROR [open_webui.apps.ollama.main] Connection error: 
INFO:     ('221.212.116.9', 61378) - "WebSocket /ws/socket.io/?EIO=4&transport=websocket" [accepted]
INFO:     connection closed
INFO:     ('221.212.116.9', 63547) - "WebSocket /ws/socket.io/?EIO=4&transport=websocket" [accepted]
INFO:     connection closed
INFO:     connection closed
INFO:     ('221.212.116.9', 1210) - "WebSocket /ws/socket.io/?EIO=4&transport=websocket" [accepted]
INFO:     connection closed
INFO:     ('221.212.116.9', 3693) - "WebSocket /ws/socket.io/?EIO=4&transport=websocket" [accepted]
INFO:     connection open
INFO:     221.212.116.9:58970 - "GET /openai/models/0 HTTP/1.1" 500 Internal Server Error
INFO:     221.212.116.9:58134 - "GET /ollama/urls HTTP/1.1" 200 OK
ERROR:    Exception in ASGI application
  + Exception Group Traceback (most recent call last):
  |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 87, in collapse_excgroups
  |     yield
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 190, in __call__
  |     async with anyio.create_task_group() as task_group:
  |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 680, in __aexit__
  |     raise BaseExceptionGroup(
  | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 401, in run_asgi
    |     result = await app(  # type: ignore[func-returns-value]
    |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    |     return await self.app(scope, receive, send)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    |     await super().__call__(scope, receive, send)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    |     await self.middleware_stack(scope, receive, send)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    |     raise exc
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    |     await self.app(scope, receive, _send)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    |     with collapse_excgroups():
    |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    |     self.gen.throw(typ, value, traceback)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    |     raise exc
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    |     response = await self.dispatch_func(request, call_next)
    |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/app/backend/open_webui/main.py", line 809, in update_embedding_function
    |     response = await call_next(request)
    |                ^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 166, in call_next
    |     raise RuntimeError("No response returned.")
    | RuntimeError: No response returned.
    +------------------------------------

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 401, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    with collapse_excgroups():
  File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/open_webui/main.py", line 809, in update_embedding_function
    response = await call_next(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 166, in call_next
    raise RuntimeError("No response returned.")
RuntimeError: No response returned.

Screenshots/Screen Recordings (if applicable):
image: https://github.com/user-attachments/assets/70b34442-dd78-48a0-ac41-6dfdea9851c1

Additional Information

  • This issue may be related to timeout settings or error handling when attempting to connect to an unreachable external service (Ollama).
  • It would be helpful to provide faster feedback or error messages when the connection to Ollama fails.
Reference: github-starred/open-webui#13931