[GH-ISSUE #24153] issue: v0.9.2 -> can't get model list with no_proxy in a company proxy environment #58878

Closed
opened 2026-05-06 00:19:35 -05:00 by GiteaMirror · 11 comments
Owner

Originally created by @somera on GitHub (Apr 26, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/24153

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

0.9.2

Ollama Version (if applicable)

0.21.2

Operating System

Ubuntu 24.04.4

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

I updated from 0.8.12 to 0.9.2 and my instance can't get any model list:

2026-04-26 11:35:23.951 | ERROR    | open_webui.routers.openai:get_models:643 - Unexpected error: 
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/aiohttp/client_reqrep.py", line 539, in start
    message, payload = await protocol.read()  # type: ignore[union-attr]
                             │        └ <function DataQueue.read at 0x7f9ba2e35e40>
                             └ <aiohttp.client_proto.ResponseHandler object at 0x7f9b24289da0>
  File "/usr/local/lib/python3.11/site-packages/aiohttp/streams.py", line 707, in read
    await self._waiter
          │    └ None
          └ <aiohttp.client_proto.ResponseHandler object at 0x7f9b24289da0>
asyncio.exceptions.CancelledError
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/usr/local/lib/python3.11/site-packages/uvicorn/__main__.py", line 4, in <module>
    uvicorn.main()
    │       └ <Command main>
    └ <module 'uvicorn' from '/usr/local/lib/python3.11/site-packages/uvicorn/__init__.py'>
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1514, in __call__
    return self.main(*args, **kwargs)
           │    │     │       └ {}
           │    │     └ ()
           │    └ <function Command.main at 0x7f9ba52c3240>
           └ <Command main>
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1435, in main
    rv = self.invoke(ctx)
         │    │      └ <click.core.Context object at 0x7f9ba5fbcc50>
         │    └ <function Command.invoke at 0x7f9ba52c2f20>
         └ <Command main>
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1298, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           │   │      │    │           │   └ {'host': '0.0.0.0', 'port': 8080, 'forwarded_allow_ips': '*', 'workers': 1, 'app': 'open_webui.main:app', 'uds': None, 'fd': ...
           │   │      │    │           └ <click.core.Context object at 0x7f9ba5fbcc50>
           │   │      │    └ <function main at 0x7f9ba50982c0>
           │   │      └ <Command main>
           │   └ <function Context.invoke at 0x7f9ba52c2160>
           └ <click.core.Context object at 0x7f9ba5fbcc50>
  File "/usr/local/lib/python3.11/site-packages/click/core.py", line 853, in invoke
    return callback(*args, **kwargs)
           │         │       └ {'host': '0.0.0.0', 'port': 8080, 'forwarded_allow_ips': '*', 'workers': 1, 'app': 'open_webui.main:app', 'uds': None, 'fd': ...
           │         └ ()
           └ <function main at 0x7f9ba50982c0>
  File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 433, in main
    run(
    └ <function run at 0x7f9ba5159ee0>
  File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 606, in run
    server.run()
    │      └ <function Server.run at 0x7f9ba5159760>
    └ <uvicorn.server.Server object at 0x7f9ba526e6d0>
  File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 75, in run
    return asyncio_run(self.serve(sockets=sockets), loop_factory=self.config.get_loop_factory())
           │           │    │             │                      │    │      └ <function Config.get_loop_factory at 0x7f9ba5290400>
           │           │    │             │                      │    └ <uvicorn.config.Config object at 0x7f9ba509d850>
           │           │    │             │                      └ <uvicorn.server.Server object at 0x7f9ba526e6d0>
           │           │    │             └ None
           │           │    └ <function Server.serve at 0x7f9ba5159800>
           │           └ <uvicorn.server.Server object at 0x7f9ba526e6d0>
           └ <function asyncio_run at 0x7f9ba52c6200>
  File "/usr/local/lib/python3.11/site-packages/uvicorn/_compat.py", line 30, in asyncio_run
    return runner.run(main)
           │      │   └ <coroutine object Server.serve at 0x7f9ba4fff1f0>
           │      └ <function Runner.run at 0x7f9ba5514fe0>
           └ <asyncio.runners.Runner object at 0x7f9ba50a5490>
  File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           │    │     │                  └ <Task pending name='Task-1' coro=<Server.serve() running at /usr/local/lib/python3.11/site-packages/uvicorn/server.py:79> wai...
           │    │     └ <cyfunction Loop.run_until_complete at 0x7f9ba4dbe740>
           │    └ <uvloop.Loop running=True closed=False debug=False>
           └ <asyncio.runners.Runner object at 0x7f9ba50a5490>
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 144, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
          │    │   │      │                      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.send_no_error at 0x7f9b241e8360>
          │    │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
          │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
          │    └ <open_webui.utils.asgi_middleware.RedirectMiddleware object at 0x7f9b25795090>
          └ <open_webui.utils.security_headers.SecurityHeadersMiddleware object at 0x7f9b25978350>
  File "/app/backend/open_webui/utils/asgi_middleware.py", line 258, in __call__
    await self.app(scope, receive, send)
          │    │   │      │        └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.send_no_error at 0x7f9b241e8360>
          │    │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
          │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
          │    └ <starlette_compress.CompressMiddleware object at 0x7f9ba4dec900>
          └ <open_webui.utils.asgi_middleware.RedirectMiddleware object at 0x7f9b25795090>
  File "/usr/local/lib/python3.11/site-packages/starlette_compress/__init__.py", line 104, in __call__
    return await self._zstd(scope, receive, send)
                 │    │     │      │        └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.send_no_error at 0x7f9b241e8360>
                 │    │     │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
                 │    │     └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
                 │    └ <member '_zstd' of 'CompressMiddleware' objects>
                 └ <starlette_compress.CompressMiddleware object at 0x7f9ba4dec900>
  File "/usr/local/lib/python3.11/site-packages/starlette_compress/_zstd_legacy.py", line 107, in __call__
    await self.app(scope, receive, wrapper)
          │    │   │      │        └ <function ZstdResponder.__call__.<locals>.wrapper at 0x7f9b241e8400>
          │    │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
          │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
          │    └ <member 'app' of 'ZstdResponder' objects>
          └ <starlette_compress._zstd_legacy.ZstdResponder object at 0x7f9b25a8e0c0>
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 63, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
          │                            │    │    │     │      │        └ <function ZstdResponder.__call__.<locals>.wrapper at 0x7f9b241e8400>
          │                            │    │    │     │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
          │                            │    │    │     └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
          │                            │    │    └ <starlette.requests.Request object at 0x7f9b243a3250>
          │                            │    └ <fastapi.middleware.asyncexitstack.AsyncExitStackMiddleware object at 0x7f9b25e38290>
          │                            └ <starlette.middleware.exceptions.ExceptionMiddleware object at 0x7f9b25e29290>
          └ <function wrap_app_handling_exceptions at 0x7f9ba27e0040>
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
          │   │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f9b241e8cc0>
          │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
          │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
          └ <fastapi.middleware.asyncexitstack.AsyncExitStackMiddleware object at 0x7f9b25e38290>
  File "/usr/local/lib/python3.11/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
          │    │   │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f9b241e8cc0>
          │    │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
          │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
          │    └ <fastapi.routing.APIRouter object at 0x7f9b27120190>
          └ <fastapi.middleware.asyncexitstack.AsyncExitStackMiddleware object at 0x7f9b25e38290>
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 660, in __call__
    await self.middleware_stack(scope, receive, send)
          │    │                │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f9b241e8cc0>
          │    │                │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
          │    │                └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
          │    └ <bound method Router.app of <fastapi.routing.APIRouter object at 0x7f9b27120190>>
          └ <fastapi.routing.APIRouter object at 0x7f9b27120190>
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 680, in app
    await route.handle(scope, receive, send)
          │     │      │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f9b241e8cc0>
          │     │      │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
          │     │      └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
          │     └ <function Route.handle at 0x7f9ba27e16c0>
          └ APIRoute(path='/openai/models/{url_idx}', name='get_models', methods=['GET'])
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
          │    │   │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f9b241e8cc0>
          │    │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
          │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
          │    └ <function request_response.<locals>.app at 0x7f9b25e7c2c0>
          └ APIRoute(path='/openai/models/{url_idx}', name='get_models', methods=['GET'])
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 130, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
          │                            │    │        │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f9b241e8cc0>
          │                            │    │        │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
          │                            │    │        └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
          │                            │    └ <starlette.requests.Request object at 0x7f9b243a1cd0>
          │                            └ <function request_response.<locals>.app.<locals>.app at 0x7f9b241eafc0>
          └ <function wrap_app_handling_exceptions at 0x7f9ba27e0040>
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
          │   │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f9b241ebe20>
          │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f9b241e9940>
          │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('10.255.10.2', 8080), '...
          └ <function request_response.<locals>.app.<locals>.app at 0x7f9b241eafc0>
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 116, in app
    response = await f(request)
                     │ └ <starlette.requests.Request object at 0x7f9b243a1cd0>
                     └ <function get_request_handler.<locals>.app at 0x7f9b25e7c360>
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 670, in app
    raw_response = await run_endpoint_function(
                         └ <function run_endpoint_function at 0x7f9ba27e3740>
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 324, in run_endpoint_function
    return await dependant.call(**values)
                 │         │      └ {'user': UserModel(id='ce763973-4480-412c-86a2-3fa1a717a802', email='user@company', username=None, role='admi...
                 │         └ <function get_models at 0x7f9b2c4a54e0>
                 └ Dependant(path_params=[ModelField(field_info=Path(PydanticUndefined), name='url_idx', mode='validation', config=None)], query...
> File "/app/backend/open_webui/routers/openai.py", line 602, in get_models
    async with session.get(
               │       └ <function ClientSession.get at 0x7f9ba2b396c0>
               └ <aiohttp.client.ClientSession object at 0x7f9b24227650>
  File "/usr/local/lib/python3.11/site-packages/aiohttp/client.py", line 1521, in __aenter__
    self._resp: _RetType = await self._coro
    │    │                       │    └ <member '_coro' of '_BaseRequestContextManager' objects>
    │    │                       └ <aiohttp.client._BaseRequestContextManager object at 0x7f9b2438f2b0>
    │    └ <member '_resp' of '_BaseRequestContextManager' objects>
    └ <aiohttp.client._BaseRequestContextManager object at 0x7f9b2438f2b0>
  File "/usr/local/lib/python3.11/site-packages/aiohttp/client.py", line 788, in _request
    resp = await handler(req)
                 │       └ <aiohttp.client_reqrep.ClientRequest object at 0x7f9b241f4190>
                 └ <function ClientSession._request.<locals>._connect_and_send_request at 0x7f9b241ebf60>
  File "/usr/local/lib/python3.11/site-packages/aiohttp/client.py", line 766, in _connect_and_send_request
    await resp.start(conn)
          │    │     └ Connection<ConnectionKey(host='10.53.209.12', port=8000, is_ssl=False, ssl=True, proxy=URL('http://proxy.company.de:3128/'), p...
          │    └ <function ClientResponse.start at 0x7f9ba2aa54e0>
          └ <ClientResponse(http://10.53.209.12:8000/v1/models) [None None]>
            None
  File "/usr/local/lib/python3.11/site-packages/aiohttp/client_reqrep.py", line 534, in start
    with self._timer:
         │    └ <aiohttp.helpers.TimerContext object at 0x7f9b242afad0>
         └ <ClientResponse(http://10.53.209.12:8000/v1/models) [None None]>
           None
  File "/usr/local/lib/python3.11/site-packages/aiohttp/helpers.py", line 713, in __exit__
    raise asyncio.TimeoutError from exc_val
          │       │                 └ CancelledError()
          │       └ <class 'TimeoutError'>
          └ <module 'asyncio' from '/usr/local/lib/python3.11/asyncio/__init__.py'>
TimeoutError
2026-04-26 11:35:23.972 | ERROR    | open_webui.routers.ollama:send_get_request:121 - Connection error:

I have two HTTP model endpoints (vLLM and Ollama), and my model list is empty.

I run my setup in Docker in a company environment behind a proxy. Both no_proxy and NO_PROXY are set.

With 0.8.12 everything works fine.
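Note that the traceback above shows the request to http://10.53.209.12:8000/v1/models being routed through proxy.company.de:3128 even though that host is in no_proxy. A quick way to check how Python's standard library itself interprets the no_proxy value (a sketch; the IP and proxy host are taken from the traceback, the exact no_proxy list is an assumption):

```python
import os
import urllib.request

# Placeholder values mirroring this setup; adjust to your environment.
os.environ["no_proxy"] = "10.53.209.12,localhost,127.0.0.1"
os.environ["http_proxy"] = "http://proxy.company.de:3128"

# proxy_bypass_environment() implements the stdlib's no_proxy matching:
# exact host match or domain-suffix match; a value of '*' bypasses everything.
print(urllib.request.proxy_bypass_environment("10.53.209.12"))  # True (listed)
print(urllib.request.proxy_bypass_environment("10.53.209.13"))  # False (not listed)

# getproxies_environment() shows which *_proxy variables the process sees.
print(urllib.request.getproxies_environment())
```

If this prints True for the backend IP but Open WebUI still goes through the proxy, the regression is likely in how the aiohttp client session is configured (env proxies are only honored when the session is created with trust_env=True), rather than in the environment itself.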

Actual Behavior

Model list is empty.

Steps to Reproduce

I had the same problem after upgrading to 0.9.1.
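For reference, the container is started along these lines (a sketch, not my exact command; the image tag, backend URL, and no_proxy list are placeholders based on the values visible in the traceback above):

```shell
# Proxy-related environment passed to the Open WebUI container.
# Placeholder values; the real proxy host and backend IPs differ.
docker run -d -p 8080:8080 \
  -e http_proxy=http://proxy.company.de:3128 \
  -e https_proxy=http://proxy.company.de:3128 \
  -e no_proxy=10.53.209.12,localhost,127.0.0.1 \
  -e NO_PROXY=10.53.209.12,localhost,127.0.0.1 \
  -e OPENAI_API_BASE_URL=http://10.53.209.12:8000/v1 \
  ghcr.io/open-webui/open-webui:main
```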

Logs & Screenshots

See above.

Additional Information

No response

GiteaMirror added the bug label 2026-05-06 00:19:35 -05:00
Author
Owner

@somera commented on GitHub (Apr 26, 2026):

I have the `http_proxy`/`https_proxy`/`no_proxy` configuration that works fine in 0.8.12 (https://docs.openwebui.com/reference/env-configuration#proxy-settings). But I see in the stack traces that the requests go through the company proxy, which is wrong.

Why?

Because we need this environment, I am going back to my 0.8.12 backup.

Author
Owner

@rgaricano commented on GitHub (Apr 26, 2026):

You have a timeout error, likely because v0.9.1 introduced stricter timeout controls that don't account for proxy-induced delays.
You can try setting (or increasing) the `AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST` env var (e.g. `60`), or disable the timeout entirely with `AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST=""`.

Author
Owner

@somera commented on GitHub (Apr 26, 2026):

The time to get the model list from both endpoints is <1 second. I have now added a 10s timeout:

      # Base app settings
      WEBUI_URL: ${WEBUI_URL:-http://localhost:3000}
      WEBUI_SECRET_KEY: ${WEBUI_SECRET_KEY}
      ENABLE_SIGNUP: ${ENABLE_SIGNUP:-false}
      DEFAULT_USER_ROLE: ${DEFAULT_USER_ROLE:-pending}
      WEBUI_ADMIN_EMAIL: ${WEBUI_ADMIN_EMAIL}
      WEBUI_ADMIN_PASSWORD: ${WEBUI_ADMIN_PASSWORD}
      ENABLE_VERSION_UPDATE_CHECK: ${ENABLE_VERSION_UPDATE_CHECK:-false}
      OFFLINE_MODE: ${OFFLINE_MODE:-false}
      CORS_ALLOW_ORIGIN: ${CORS_ALLOW_ORIGIN:-http://localhost:3000}
      WEBUI_SESSION_COOKIE_SECURE: ${WEBUI_SESSION_COOKIE_SECURE:-false}
      WEBUI_AUTH_COOKIE_SECURE: ${WEBUI_AUTH_COOKIE_SECURE:-false}

      # Set a shorter timeout (in seconds) for faster failure on unreachable endpoints
      AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST: 10

Same problem.

Why do I see the company proxy in the stack trace?

File "/usr/local/lib/python3.11/site-packages/aiohttp/client.py", line 766, in _connect_and_send_request
    await resp.start(conn)
          │    │     └ Connection<ConnectionKey(host='10.53.209.12', port=8000, is_ssl=False, ssl=True, proxy=URL('http://proxy.company.de:3128/'), p...
          │    └ <function ClientResponse.start at 0x7f9ba2aa54e0>
          └ <ClientResponse(http://10.53.209.12:8000/v1/models) [None None]>
            None
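As a sanity check for the question above: at least in recent versions, aiohttp's `trust_env=True` path consults the same proxy helpers as the Python standard library, so the `no_proxy` matching can be probed directly with `urllib`. A minimal sketch, using the proxy and host values quoted in this thread (set both spellings of each variable, since tools differ in which case they read):

```python
import os
import urllib.request

# Values taken from the stack trace in this thread.
os.environ["http_proxy"] = os.environ["HTTP_PROXY"] = "http://proxy.company.de:3128"
os.environ["no_proxy"] = os.environ["NO_PROXY"] = "10.53.209.12,localhost"

# getproxies() reads the *_proxy variables; proxy_bypass() applies no_proxy.
print(urllib.request.getproxies().get("http"))      # http://proxy.company.de:3128
print(urllib.request.proxy_bypass("10.53.209.12"))  # truthy: the proxy should be skipped
```

If the second line prints a truthy value, the environment itself is configured correctly, and the proxy selection inside the application is the suspect.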
Author
Owner

@somera commented on GitHub (Apr 26, 2026):

With the 10-second timeout:

> File "/app/backend/open_webui/routers/openai.py", line 602, in get_models
    async with session.get(
               │       └ <function ClientSession.get at 0x7e09ddeed6c0>
               └ <aiohttp.client.ClientSession object at 0x7e095873d690>
  File "/usr/local/lib/python3.11/site-packages/aiohttp/client.py", line 1521, in __aenter__
    self._resp: _RetType = await self._coro
    │    │                       │    └ <member '_coro' of '_BaseRequestContextManager' objects>
    │    │                       └ <aiohttp.client._BaseRequestContextManager object at 0x7e09587094e0>
    │    └ <member '_resp' of '_BaseRequestContextManager' objects>
    └ <aiohttp.client._BaseRequestContextManager object at 0x7e09587094e0>
  File "/usr/local/lib/python3.11/site-packages/aiohttp/client.py", line 788, in _request
    resp = await handler(req)
                 │       └ <aiohttp.client_reqrep.ClientRequest object at 0x7e095872f810>
                 └ <function ClientSession._request.<locals>._connect_and_send_request at 0x7e0960227100>
  File "/usr/local/lib/python3.11/site-packages/aiohttp/client.py", line 766, in _connect_and_send_request
    await resp.start(conn)
          │    │     └ Connection<ConnectionKey(host='10.53.209.12', port=8000, is_ssl=False, ssl=True, proxy=URL('http://proxy.company.de:3128'), p...
          │    └ <function ClientResponse.start at 0x7e09dde594e0>
          └ <ClientResponse(http://10.53.209.12:8000/v1/models) [None None]>
            None
  File "/usr/local/lib/python3.11/site-packages/aiohttp/client_reqrep.py", line 534, in start
    with self._timer:
         │    └ <aiohttp.helpers.TimerContext object at 0x7e0958752f30>
         └ <ClientResponse(http://10.53.209.12:8000/v1/models) [None None]>
           None
  File "/usr/local/lib/python3.11/site-packages/aiohttp/helpers.py", line 713, in __exit__
    raise asyncio.TimeoutError from exc_val
          │       │                 └ CancelledError()
          │       └ <class 'TimeoutError'>
          └ <module 'asyncio' from '/usr/local/lib/python3.11/asyncio/__init__.py'>
TimeoutError

But

$ curl -o /dev/null -s -w 'Total: %{time_total}s\n' http://10.53.209.12:8000/v1/models
Total: 0.030070s

on the same VM where Open WebUI is running.

Author
Owner

@rgaricano commented on GitHub (Apr 26, 2026):

Could you try with `AIOHTTP_CLIENT_SESSION_SSL: false`?

Author
Owner

@somera commented on GitHub (Apr 26, 2026):

With:

      AIOHTTP_CLIENT_SESSION_SSL: "false"
      AIOHTTP_CLIENT_TIMEOUT_MODEL_LIST: "10"

I don't see the stack traces now. Only errors:

2026-04-26 12:31:35.923 | ERROR    | open_webui.routers.openai:send_get_request:121 - Connection error: 
2026-04-26 12:31:35.924 | ERROR    | open_webui.routers.ollama:send_get_request:121 - Connection error: 
2026-04-26 12:31:46.922 | ERROR    | open_webui.routers.ollama:send_get_request:121 - Connection error: 
2026-04-26 12:31:46.924 | INFO     | uvicorn.protocols.http.httptools_impl:send:483 - 10.209.109.2:0 - "GET /api/models HTTP/1.1" 200
2026-04-26 12:31:59.925 | INFO     | uvicorn.protocols.http.httptools_impl:send:483 - 10.209.109.2:0 - "GET /_app/version.json HTTP/1.1" 200
2026-04-26 12:32:30.602 | INFO     | uvicorn.protocols.http.httptools_impl:send:483 - 172.18.167.185:0 - "GET /_app/version.json HTTP/1.1" 200
2026-04-26 12:32:35.937 | INFO     | uvicorn.protocols.http.httptools_impl:send:483 - 10.209.109.2:0 - "GET /_app/version.json HTTP/1.1" 200
2026-04-26 12:32:52.556 | INFO     | open_webui.routers.openai:get_all_models:496 - get_all_models()
2026-04-26 12:32:52.556 | INFO     | open_webui.routers.ollama:get_all_models:329 - get_all_models()
2026-04-26 12:32:55.894 | INFO     | uvicorn.protocols.http.httptools_impl:send:483 - 10.209.109.2:0 - "GET / HTTP/1.1" 200
Author
Owner

@somera commented on GitHub (Apr 26, 2026):

Now I'm back to my v0.8.12 backup.

Author
Owner

@somera commented on GitHub (Apr 27, 2026):

Can someone confirm a bug in Open WebUI?

Author
Owner

@somera commented on GitHub (Apr 28, 2026):

I was able to upgrade another instance to v0.9.2, and there it works.

I don't understand why here

  File "/usr/local/lib/python3.11/site-packages/aiohttp/client.py", line 766, in _connect_and_send_request
    await resp.start(conn)
          │    │     └ Connection<ConnectionKey(host='10.53.209.12', port=8000, is_ssl=False, ssl=True, proxy=URL('http://proxy.company.de:3128/'), p...
          │    └ <function ClientResponse.start at 0x7f9ba2aa54e0>
          └ <ClientResponse(http://10.53.209.12:8000/v1/models) [None None]>
            None

the proxy is set. The IP `10.53.209.12` is listed in `no_proxy`, and the same setup works fine with v0.8.12.

Author
Owner

@Classic298 commented on GitHub (May 2, 2026):

It is a real Open WebUI regression.

Cause: `backend/open_webui/routers/openai.py:574` (and similar) uses `aiohttp.ClientSession(trust_env=True)` and relies entirely on aiohttp to honor `NO_PROXY`. The `aiohttp==3.13.5` pin in `backend/requirements.txt:16` has buggy `NO_PROXY` matching; 0.8.12 shipped an older aiohttp where it worked. The stack trace confirms this: the connection still carries `proxy=URL('http://proxy.company.de:3128/')` despite the IP being in `no_proxy`.
Fix: bypass the proxy explicitly in code, or change the aiohttp pin.
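If that diagnosis holds, one possible shape for the "bypass explicitly in code" fix is to make the bypass decision with the standard library rather than relying on aiohttp's own matching. This is a sketch under that assumption, not the project's actual code; `should_trust_env` is a hypothetical helper:

```python
import urllib.parse
import urllib.request

def should_trust_env(base_url: str) -> bool:
    """Return True if the environment proxy should apply to base_url.

    Sketch only: the result would be passed as trust_env= when building
    the aiohttp.ClientSession, so hosts listed in no_proxy never go
    through the proxy even if aiohttp's NO_PROXY matching misbehaves.
    """
    host = urllib.parse.urlsplit(base_url).hostname or ""
    # urllib applies the no_proxy rules from the environment.
    return not urllib.request.proxy_bypass(host)
```

For example, `aiohttp.ClientSession(trust_env=should_trust_env(url))` would keep the proxy for external endpoints while forcing a direct connection for hosts on the bypass list.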

Author
Owner

@somera commented on GitHub (May 3, 2026):

My upgrade works now. I added

      # Proxy config
      HTTP_PROXY: ${HTTP_PROXY}
      HTTPS_PROXY: ${HTTPS_PROXY}
      NO_PROXY: ${NO_PROXY}
      http_proxy: ${HTTP_PROXY}
      https_proxy: ${HTTPS_PROXY}
      no_proxy: ${NO_PROXY}

to my docker-compose.yaml.


Reference: github-starred/open-webui#58878