[GH-ISSUE #14496] issue: Error: 'NoneType' object is not subscriptable #17276

Closed
opened 2026-04-19 22:59:27 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @iyogeshjoshi on GitHub (May 29, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/14496

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

v0.6.12

Ollama Version (if applicable)

0.8.0

Operating System

macOS Sequoia

Browser (if applicable)

Brave 1.78.97, Arc 1.95.0, Safari 18.5

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

Whenever I chat with any of the models, they should respond normally without throwing an error.

Actual Behavior

Whenever I send a message to any model, whether OpenAI or Ollama, the same error is thrown: 'NoneType' object is not subscriptable.

Steps to Reproduce

Have a previous version of the open-webui Docker image up and running, connected to a local (non-Docker) Ollama installation. The container was originally created with the following command:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

Now, update the image using the following command:

docker run --rm -v /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui

The update itself completes fine, but afterwards a few things break. Immediately, I lost access to all my Ollama models; none of them are listed anymore. After I added an OpenAI API key, model listing worked fine for OpenAI.
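One pattern worth noting in the container logs (an assumption on my part, not a confirmed diagnosis): the backend repeatedly fails to connect to 127.0.0.1:11434, but inside a container that address points at the container's own loopback, not at the macOS host where the local Ollama is listening. The --add-host=host.docker.internal:host-gateway flag in the run command above exists precisely so the host is reachable under a stable hostname. A minimal sketch of that distinction (the function name is hypothetical, not Open WebUI code):

```python
def ollama_base_url(running_in_docker: bool) -> str:
    """Pick an Ollama base URL depending on where the caller runs.

    Inside a container, 127.0.0.1 resolves to the container itself, so a
    host-local Ollama on port 11434 is unreachable there. The
    --add-host=host.docker.internal:host-gateway flag maps the Docker host
    gateway to a hostname that works from inside the container.
    """
    host = "host.docker.internal" if running_in_docker else "127.0.0.1"
    return f"http://{host}:11434"


# From inside the open-webui container, the host-gateway alias is needed:
print(ollama_base_url(running_in_docker=True))  # http://host.docker.internal:11434
```

If this reading is right, pointing OLLAMA_BASE_URL at http://host.docker.internal:11434 (rather than localhost or 127.0.0.1) would be the setting to check first.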

Logs & Screenshots

Browser logs:

[screenshot: see the original GitHub issue]

Docker logs

2025-05-29 21:45:59.722 | 2025-05-29 16:15:59.722 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET / HTTP/1.1" 304 - {}
2025-05-29 21:45:59.767 | 2025-05-29 16:15:59.767 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /static/loader.js HTTP/1.1" 304 - {}
2025-05-29 21:45:59.774 | 2025-05-29 16:15:59.773 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /static/splash.png HTTP/1.1" 304 - {}
2025-05-29 21:45:59.993 | 2025-05-29 16:15:59.993 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /static/splash-dark.png HTTP/1.1" 304 - {}
2025-05-29 21:46:00.031 | 2025-05-29 16:16:00.030 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:53816 - "GET /manifest.json HTTP/1.1" 200 - {}
2025-05-29 21:46:00.040 | 2025-05-29 16:16:00.039 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /static/favicon.ico HTTP/1.1" 304 - {}
2025-05-29 21:46:00.119 | 2025-05-29 16:16:00.118 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /api/config HTTP/1.1" 200 - {}
2025-05-29 21:46:00.126 | 2025-05-29 16:16:00.126 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /static/favicon.png HTTP/1.1" 304 - {}
2025-05-29 21:46:00.191 | 2025-05-29 16:16:00.190 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /api/v1/auths/ HTTP/1.1" 200 - {}
2025-05-29 21:46:00.204 | 2025-05-29 16:16:00.204 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /api/config HTTP/1.1" 200 - {}
2025-05-29 21:46:00.247 | 2025-05-29 16:16:00.246 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /api/v1/chats/archived?page=1&order_by=updated_at&direction=desc HTTP/1.1" 200 - {}
2025-05-29 21:46:00.252 | 2025-05-29 16:16:00.251 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:53816 - "GET /api/changelog HTTP/1.1" 200 - {}
2025-05-29 21:46:00.291 | 2025-05-29 16:16:00.273 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:21819 - "GET /api/v1/channels/ HTTP/1.1" 200 - {}
2025-05-29 21:46:00.302 | 2025-05-29 16:16:00.302 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:34784 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {}
2025-05-29 21:46:00.308 | 2025-05-29 16:16:00.307 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /api/v1/users/user/settings HTTP/1.1" 200 - {}
2025-05-29 21:46:00.338 | 2025-05-29 16:16:00.337 | INFO     | open_webui.routers.openai:get_all_models:391 - get_all_models() - {}
2025-05-29 21:46:00.338 | 2025-05-29 16:16:00.338 | INFO     | open_webui.routers.ollama:get_all_models:323 - get_all_models() - {}
2025-05-29 21:46:00.341 | 2025-05-29 16:16:00.341 | ERROR    | open_webui.routers.ollama:send_get_request:102 - Connection error: Cannot connect to host 127.0.0.1:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)] - {}
2025-05-29 21:46:00.345 | 2025-05-29 16:16:00.344 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /api/v1/chats/all/tags HTTP/1.1" 200 - {}
2025-05-29 21:46:00.345 | 2025-05-29 16:16:00.345 | ERROR    | open_webui.routers.ollama:send_get_request:102 - Connection error: Cannot connect to host 127.0.0.1:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)] - {}
2025-05-29 21:46:00.358 | 2025-05-29 16:16:00.357 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /api/v1/chats/pinned HTTP/1.1" 200 - {}
2025-05-29 21:46:00.380 | 2025-05-29 16:16:00.379 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:59456 - "GET /api/v1/folders/ HTTP/1.1" 200 - {}
2025-05-29 21:46:00.390 | 2025-05-29 16:16:00.389 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:21819 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {}
2025-05-29 21:46:00.533 | 2025-05-29 16:16:00.533 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:21819 - "GET /api/v1/chats/?page=2 HTTP/1.1" 200 - {}
2025-05-29 21:46:01.447 | 2025-05-29 16:16:01.447 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:42978 - "GET /_app/version.json HTTP/1.1" 200 - {}
2025-05-29 21:46:01.779 | 2025-05-29 16:16:01.778 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:34784 - "GET /api/models HTTP/1.1" 200 - {}
2025-05-29 21:46:01.789 | 2025-05-29 16:16:01.789 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:34784 - "GET /api/v1/configs/banners HTTP/1.1" 200 - {}
2025-05-29 21:46:01.798 | 2025-05-29 16:16:01.797 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:34784 - "GET /api/v1/tools/ HTTP/1.1" 200 - {}
2025-05-29 21:46:01.822 | 2025-05-29 16:16:01.822 | ERROR    | open_webui.routers.ollama:send_get_request:102 - Connection error: Cannot connect to host 127.0.0.1:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)] - {}
2025-05-29 21:46:01.823 | 2025-05-29 16:16:01.822 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:34784 - "GET /ollama/api/version HTTP/1.1" 500 - {}
2025-05-29 21:46:01.862 | 2025-05-29 16:16:01.861 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:34784 - "GET /api/v1/users/user/settings HTTP/1.1" 200 - {}
2025-05-29 21:46:10.116 | 2025-05-29 16:16:10.115 | INFO     | open_webui.routers.openai:get_all_models:391 - get_all_models() - {}
2025-05-29 21:46:10.116 | 2025-05-29 16:16:10.116 | INFO     | open_webui.routers.ollama:get_all_models:323 - get_all_models() - {}
2025-05-29 21:46:10.118 | 2025-05-29 16:16:10.118 | ERROR    | open_webui.routers.ollama:send_get_request:102 - Connection error: Cannot connect to host 127.0.0.1:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)] - {}
2025-05-29 21:46:10.120 | 2025-05-29 16:16:10.119 | ERROR    | open_webui.routers.ollama:send_get_request:102 - Connection error: Cannot connect to host 127.0.0.1:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)] - {}
2025-05-29 21:46:12.035 | 2025-05-29 16:16:12.035 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:60273 - "GET /api/models HTTP/1.1" 200 - {}
2025-05-29 21:46:15.844 | 2025-05-29 16:16:15.843 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:60273 - "POST /api/v1/chats/new HTTP/1.1" 200 - {}
2025-05-29 21:46:15.914 | 2025-05-29 16:16:15.913 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:60273 - "GET /api/v1/chats/?page=2 HTTP/1.1" 200 - {}
2025-05-29 21:46:15.965 | 2025-05-29 16:16:15.964 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:60273 - "POST /api/v1/chats/3d345ffe-9434-4719-aeee-17ae64e4b29a HTTP/1.1" 200 - {}
2025-05-29 21:46:15.980 | 2025-05-29 16:16:15.979 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:60273 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {}
2025-05-29 21:46:16.101 | 2025-05-29 16:16:16.078 | ERROR    | open_webui.retrieval.utils:generate_ollama_batch_embeddings:742 - Error generating ollama batch embeddings: 404 Client Error: Not Found for url: http://host.docker.internal:11434/api/embed - {}
2025-05-29 21:46:16.102 | Traceback (most recent call last):
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 |   File "<frozen runpy>", line 198, in _run_module_as_main
2025-05-29 21:46:16.102 |   File "<frozen runpy>", line 88, in _run_code
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/uvicorn/__main__.py", line 4, in <module>
2025-05-29 21:46:16.102 |     uvicorn.main()
2025-05-29 21:46:16.102 |     │       └ <Command main>
2025-05-29 21:46:16.102 |     └ <module 'uvicorn' from '/usr/local/lib/python3.11/site-packages/uvicorn/__init__.py'>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1442, in __call__
2025-05-29 21:46:16.102 |     return self.main(*args, **kwargs)
2025-05-29 21:46:16.102 |            │    │     │       └ {}
2025-05-29 21:46:16.102 |            │    │     └ ()
2025-05-29 21:46:16.102 |            │    └ <function Command.main at 0x7f302ae667a0>
2025-05-29 21:46:16.102 |            └ <Command main>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1363, in main
2025-05-29 21:46:16.102 |     rv = self.invoke(ctx)
2025-05-29 21:46:16.102 |          │    │      └ <click.core.Context object at 0x7f302bbe4710>
2025-05-29 21:46:16.102 |          │    └ <function Command.invoke at 0x7f302ae66480>
2025-05-29 21:46:16.102 |          └ <Command main>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 1226, in invoke
2025-05-29 21:46:16.102 |     return ctx.invoke(self.callback, **ctx.params)
2025-05-29 21:46:16.102 |            │   │      │    │           │   └ {'host': '0.0.0.0', 'port': 8080, 'forwarded_allow_ips': '*', 'workers': 1, 'app': 'open_webui.main:app', 'uds': None, 'fd': ...
2025-05-29 21:46:16.102 |            │   │      │    │           └ <click.core.Context object at 0x7f302bbe4710>
2025-05-29 21:46:16.102 |            │   │      │    └ <function main at 0x7f302ac83a60>
2025-05-29 21:46:16.102 |            │   │      └ <Command main>
2025-05-29 21:46:16.102 |            │   └ <function Context.invoke at 0x7f302ae656c0>
2025-05-29 21:46:16.102 |            └ <click.core.Context object at 0x7f302bbe4710>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/click/core.py", line 794, in invoke
2025-05-29 21:46:16.102 |     return callback(*args, **kwargs)
2025-05-29 21:46:16.102 |            │         │       └ {'host': '0.0.0.0', 'port': 8080, 'forwarded_allow_ips': '*', 'workers': 1, 'app': 'open_webui.main:app', 'uds': None, 'fd': ...
2025-05-29 21:46:16.102 |            │         └ ()
2025-05-29 21:46:16.102 |            └ <function main at 0x7f302ac83a60>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 412, in main
2025-05-29 21:46:16.102 |     run(
2025-05-29 21:46:16.102 |     └ <function run at 0x7f302ae6b9c0>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/uvicorn/main.py", line 579, in run
2025-05-29 21:46:16.102 |     server.run()
2025-05-29 21:46:16.102 |     │      └ <function Server.run at 0x7f302aefce00>
2025-05-29 21:46:16.102 |     └ <uvicorn.server.Server object at 0x7f302ad84f10>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 66, in run
2025-05-29 21:46:16.102 |     return asyncio.run(self.serve(sockets=sockets))
2025-05-29 21:46:16.102 |            │       │   │    │             └ None
2025-05-29 21:46:16.102 |            │       │   │    └ <function Server.serve at 0x7f302aefcea0>
2025-05-29 21:46:16.102 |            │       │   └ <uvicorn.server.Server object at 0x7f302ad84f10>
2025-05-29 21:46:16.102 |            │       └ <function run at 0x7f302b1bd260>
2025-05-29 21:46:16.102 |            └ <module 'asyncio' from '/usr/local/lib/python3.11/asyncio/__init__.py'>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
2025-05-29 21:46:16.102 |     return runner.run(main)
2025-05-29 21:46:16.102 |            │      │   └ <coroutine object Server.serve at 0x7f302ac2e2f0>
2025-05-29 21:46:16.102 |            │      └ <function Runner.run at 0x7f302b030e00>
2025-05-29 21:46:16.102 |            └ <asyncio.runners.Runner object at 0x7f302b90ae90>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
2025-05-29 21:46:16.102 |     return self._loop.run_until_complete(task)
2025-05-29 21:46:16.102 |            │    │     │                  └ <Task pending name='Task-1' coro=<Server.serve() running at /usr/local/lib/python3.11/site-packages/uvicorn/server.py:70> wai...
2025-05-29 21:46:16.102 |            │    │     └ <cyfunction Loop.run_until_complete at 0x7f302ac6f440>
2025-05-29 21:46:16.102 |            │    └ <uvloop.Loop running=True closed=False debug=False>
2025-05-29 21:46:16.102 |            └ <asyncio.runners.Runner object at 0x7f302b90ae90>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 141, in coro
2025-05-29 21:46:16.102 |     await self.app(scope, receive_or_disconnect, send_no_error)
2025-05-29 21:46:16.102 |           │    │   │      │                      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.send_no_error at 0x7f2fdf3420c0>
2025-05-29 21:46:16.102 |           │    │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f2fdf343920>
2025-05-29 21:46:16.102 |           │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.17.0.2', 8080), 'c...
2025-05-29 21:46:16.102 |           │    └ <starlette_compress.CompressMiddleware object at 0x7f3029fa22a0>
2025-05-29 21:46:16.102 |           └ <open_webui.main.RedirectMiddleware object at 0x7f2fdff53a10>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/starlette_compress/__init__.py", line 92, in __call__
2025-05-29 21:46:16.102 |     return await self._zstd(scope, receive, send)
2025-05-29 21:46:16.102 |                  │    │     │      │        └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.send_no_error at 0x7f2fdf3420c0>
2025-05-29 21:46:16.102 |                  │    │     │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f2fdf343920>
2025-05-29 21:46:16.102 |                  │    │     └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.17.0.2', 8080), 'c...
2025-05-29 21:46:16.102 |                  │    └ <member '_zstd' of 'CompressMiddleware' objects>
2025-05-29 21:46:16.102 |                  └ <starlette_compress.CompressMiddleware object at 0x7f3029fa22a0>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/starlette_compress/_zstd_legacy.py", line 100, in __call__
2025-05-29 21:46:16.102 |     await self.app(scope, receive, wrapper)
2025-05-29 21:46:16.102 |           │    │   │      │        └ <function ZstdResponder.__call__.<locals>.wrapper at 0x7f2fdf7ec0e0>
2025-05-29 21:46:16.102 |           │    │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f2fdf343920>
2025-05-29 21:46:16.102 |           │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.17.0.2', 8080), 'c...
2025-05-29 21:46:16.102 |           │    └ <member 'app' of 'ZstdResponder' objects>
2025-05-29 21:46:16.102 |           └ <starlette_compress._zstd_legacy.ZstdResponder object at 0x7f2fdf928c00>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
2025-05-29 21:46:16.102 |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
2025-05-29 21:46:16.102 |           │                            │    │    │     │      │        └ <function ZstdResponder.__call__.<locals>.wrapper at 0x7f2fdf7ec0e0>
2025-05-29 21:46:16.102 |           │                            │    │    │     │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f2fdf343920>
2025-05-29 21:46:16.102 |           │                            │    │    │     └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.17.0.2', 8080), 'c...
2025-05-29 21:46:16.102 |           │                            │    │    └ <starlette.requests.Request object at 0x7f2fdf329750>
2025-05-29 21:46:16.102 |           │                            │    └ <fastapi.routing.APIRouter object at 0x7f2fe5791110>
2025-05-29 21:46:16.102 |           │                            └ <starlette.middleware.exceptions.ExceptionMiddleware object at 0x7f2fdffc5090>
2025-05-29 21:46:16.102 |           └ <function wrap_app_handling_exceptions at 0x7f3028171e40>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
2025-05-29 21:46:16.102 |     await app(scope, receive, sender)
2025-05-29 21:46:16.102 |           │   │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f2fdf7eeac0>
2025-05-29 21:46:16.102 |           │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f2fdf343920>
2025-05-29 21:46:16.102 |           │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.17.0.2', 8080), 'c...
2025-05-29 21:46:16.102 |           └ <fastapi.routing.APIRouter object at 0x7f2fe5791110>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 715, in __call__
2025-05-29 21:46:16.102 |     await self.middleware_stack(scope, receive, send)
2025-05-29 21:46:16.102 |           │    │                │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f2fdf7eeac0>
2025-05-29 21:46:16.102 |           │    │                │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f2fdf343920>
2025-05-29 21:46:16.102 |           │    │                └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.17.0.2', 8080), 'c...
2025-05-29 21:46:16.102 |           │    └ <bound method Router.app of <fastapi.routing.APIRouter object at 0x7f2fe5791110>>
2025-05-29 21:46:16.102 |           └ <fastapi.routing.APIRouter object at 0x7f2fe5791110>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 735, in app
2025-05-29 21:46:16.102 |     await route.handle(scope, receive, send)
2025-05-29 21:46:16.102 |           │     │      │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f2fdf7eeac0>
2025-05-29 21:46:16.102 |           │     │      │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f2fdf343920>
2025-05-29 21:46:16.102 |           │     │      └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.17.0.2', 8080), 'c...
2025-05-29 21:46:16.102 |           │     └ <function Route.handle at 0x7f3028173420>
2025-05-29 21:46:16.102 |           └ APIRoute(path='/api/chat/completions', name='chat_completion', methods=['POST'])
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
2025-05-29 21:46:16.102 |     await self.app(scope, receive, send)
2025-05-29 21:46:16.102 |           │    │   │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f2fdf7eeac0>
2025-05-29 21:46:16.102 |           │    │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f2fdf343920>
2025-05-29 21:46:16.102 |           │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.17.0.2', 8080), 'c...
2025-05-29 21:46:16.102 |           │    └ <function request_response.<locals>.app at 0x7f2fdf9c1d00>
2025-05-29 21:46:16.102 |           └ APIRoute(path='/api/chat/completions', name='chat_completion', methods=['POST'])
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 76, in app
2025-05-29 21:46:16.102 |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
2025-05-29 21:46:16.102 |           │                            │    │        │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f2fdf7eeac0>
2025-05-29 21:46:16.102 |           │                            │    │        │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f2fdf343920>
2025-05-29 21:46:16.102 |           │                            │    │        └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.17.0.2', 8080), 'c...
2025-05-29 21:46:16.102 |           │                            │    └ <starlette.requests.Request object at 0x7f2fdf216f90>
2025-05-29 21:46:16.102 |           │                            └ <function request_response.<locals>.app.<locals>.app at 0x7f2fdf7ef9c0>
2025-05-29 21:46:16.102 |           └ <function wrap_app_handling_exceptions at 0x7f3028171e40>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
2025-05-29 21:46:16.102 |     await app(scope, receive, sender)
2025-05-29 21:46:16.102 |           │   │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x7f2fdf7efe20>
2025-05-29 21:46:16.102 |           │   │      └ <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0x7f2fdf343920>
2025-05-29 21:46:16.102 |           │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.17.0.2', 8080), 'c...
2025-05-29 21:46:16.102 |           └ <function request_response.<locals>.app.<locals>.app at 0x7f2fdf7ef9c0>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
2025-05-29 21:46:16.102 |     response = await f(request)
2025-05-29 21:46:16.102 |                      │ └ <starlette.requests.Request object at 0x7f2fdf216f90>
2025-05-29 21:46:16.102 |                      └ <function get_request_handler.<locals>.app at 0x7f2fdf9c1c60>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 301, in app
2025-05-29 21:46:16.102 |     raw_response = await run_endpoint_function(
2025-05-29 21:46:16.102 |                          └ <function run_endpoint_function at 0x7f3027f99260>
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 212, in run_endpoint_function
2025-05-29 21:46:16.102 |     return await dependant.call(**values)
2025-05-29 21:46:16.102 |                  │         │      └ {'user': UserModel(id='bc952897-60c8-46d0-add9-5c1e45cebe45', name='Yogi', email='me@iyogeshjoshi.com', role='admin', profile...
2025-05-29 21:46:16.102 |                  │         └ <function chat_completion at 0x7f2fdf9c0680>
2025-05-29 21:46:16.102 |                  └ Dependant(path_params=[], query_params=[], header_params=[], cookie_params=[], body_params=[ModelField(field_info=Body(Pydant...
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 |   File "/app/backend/open_webui/main.py", line 1243, in chat_completion
2025-05-29 21:46:16.102 |     form_data, metadata, events = await process_chat_payload(
2025-05-29 21:46:16.102 |     │          │                        └ <function process_chat_payload at 0x7f2fe4100860>
2025-05-29 21:46:16.102 |     │          └ {'user_id': 'bc952897-60c8-46d0-add9-5c1e45cebe45', 'chat_id': '3d345ffe-9434-4719-aeee-17ae64e4b29a', 'message_id': 'b92bdc6...
2025-05-29 21:46:16.102 |     └ {'stream': True, 'model': 'gpt-4o-audio-preview-2024-10-01', 'messages': [{'role': 'user', 'content': "Help me study vocabula...
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 |   File "/app/backend/open_webui/utils/middleware.py", line 805, in process_chat_payload
2025-05-29 21:46:16.102 |     form_data = await chat_memory_handler(
2025-05-29 21:46:16.102 |                       └ <function chat_memory_handler at 0x7f2fe4101580>
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 |   File "/app/backend/open_webui/utils/middleware.py", line 302, in chat_memory_handler
2025-05-29 21:46:16.102 |     results = await query_memory(
2025-05-29 21:46:16.102 |                     └ <function query_memory at 0x7f2fe5aba2a0>
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 |   File "/app/backend/open_webui/routers/memories.py", line 87, in query_memory
2025-05-29 21:46:16.102 |     vectors=[request.app.state.EMBEDDING_FUNCTION(form_data.content, user=user)],
2025-05-29 21:46:16.102 |              │       │                            │         │             └ UserModel(id='bc952897-60c8-46d0-add9-5c1e45cebe45', name='Yogi', email='me@iyogeshjoshi.com', role='admin', profile_image_ur...
2025-05-29 21:46:16.102 |              │       │                            │         └ "Help me study vocabulary: write a sentence for me to fill in the blank, and I'll try to pick the correct option."
2025-05-29 21:46:16.102 |              │       │                            └ QueryMemoryForm(content="Help me study vocabulary: write a sentence for me to fill in the blank, and I'll try to pick the cor...
2025-05-29 21:46:16.102 |              │       └ <property object at 0x7f30280cc590>
2025-05-29 21:46:16.102 |              └ <starlette.requests.Request object at 0x7f2fdf216f90>
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 |   File "/app/backend/open_webui/retrieval/utils.py", line 434, in <lambda>
2025-05-29 21:46:16.102 |     return lambda query, prefix=None, user=None: generate_multiple(
2025-05-29 21:46:16.102 |                   │                              └ <function get_embedding_function.<locals>.generate_multiple at 0x7f2fe4033740>
2025-05-29 21:46:16.102 |                   └ "Help me study vocabulary: write a sentence for me to fill in the blank, and I'll try to pick the correct option."
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 |   File "/app/backend/open_webui/retrieval/utils.py", line 432, in generate_multiple
2025-05-29 21:46:16.102 |     return func(query, prefix, user)
2025-05-29 21:46:16.102 |            │    │      │       └ UserModel(id='bc952897-60c8-46d0-add9-5c1e45cebe45', name='Yogi', email='me@iyogeshjoshi.com', role='admin', profile_image_ur...
2025-05-29 21:46:16.102 |            │    │      └ None
2025-05-29 21:46:16.102 |            │    └ "Help me study vocabulary: write a sentence for me to fill in the blank, and I'll try to pick the correct option."
2025-05-29 21:46:16.102 |            └ <function get_embedding_function.<locals>.<lambda> at 0x7f2fe40336a0>
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 |   File "/app/backend/open_webui/retrieval/utils.py", line 409, in <lambda>
2025-05-29 21:46:16.102 |     func = lambda query, prefix=None, user=None: generate_embeddings(
2025-05-29 21:46:16.102 |                   │                              └ <function generate_embeddings at 0x7f2fe6069a80>
2025-05-29 21:46:16.102 |                   └ "Help me study vocabulary: write a sentence for me to fill in the blank, and I'll try to pick the correct option."
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 |   File "/app/backend/open_webui/retrieval/utils.py", line 776, in generate_embeddings
2025-05-29 21:46:16.102 |     embeddings = generate_ollama_batch_embeddings(
2025-05-29 21:46:16.102 |                  └ <function generate_ollama_batch_embeddings at 0x7f2fe60699e0>
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 | > File "/app/backend/open_webui/retrieval/utils.py", line 734, in generate_ollama_batch_embeddings
2025-05-29 21:46:16.102 |     r.raise_for_status()
2025-05-29 21:46:16.102 |     │ └ <function Response.raise_for_status at 0x7f3028575120>
2025-05-29 21:46:16.102 |     └ <Response [404]>
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 |   File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1024, in raise_for_status
2025-05-29 21:46:16.102 |     raise HTTPError(http_error_msg, response=self)
2025-05-29 21:46:16.102 |           │         │                        └ <Response [404]>
2025-05-29 21:46:16.102 |           │         └ '404 Client Error: Not Found for url: http://host.docker.internal:11434/api/embed'
2025-05-29 21:46:16.102 |           └ <class 'requests.exceptions.HTTPError'>
2025-05-29 21:46:16.102 | 
2025-05-29 21:46:16.102 | requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http://host.docker.internal:11434/api/embed
2025-05-29 21:46:16.115 | 2025-05-29 16:16:16.115 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:60273 - "POST /api/chat/completions HTTP/1.1" 400 - {}
2025-05-29 21:46:16.145 | 2025-05-29 16:16:16.144 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.65.1:60273 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {}
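The traceback above ends in a 404 from /api/embed, while the chat request itself surfaces the separate 'NoneType' object is not subscriptable message. A plausible reading (my assumption, not verified against the Open WebUI source): the embeddings helper catches the HTTP error, logs it, and returns None, and a caller then indexes that None. A self-contained sketch of that failure pattern (names and logic are illustrative, not the actual Open WebUI code):

```python
class HTTPError(Exception):
    """Stand-in for requests.exceptions.HTTPError, to keep the sketch self-contained."""


def generate_ollama_batch_embeddings_sketch(texts):
    # Simulate the failing call seen in the logs: the embeddings endpoint
    # returned 404, the helper logs the error and swallows it...
    try:
        raise HTTPError(
            "404 Client Error: Not Found for url: "
            "http://host.docker.internal:11434/api/embed"
        )
    except HTTPError as e:
        print(f"Error generating ollama batch embeddings: {e}")
        return None  # ...and the caller receives None instead of a list


embeddings = generate_ollama_batch_embeddings_sketch(["hello"])
try:
    vector = embeddings[0]  # downstream code indexes the result
except TypeError as e:
    print(e)  # 'NoneType' object is not subscriptable
```

This would explain why both backends show the same message: any error in the embedding step, not just this 404, would propagate as the generic NoneType error in the chat response.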

Docker env vars

"PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
			"LANG=C.UTF-8",
			"GPG_KEY=xxxxxxx",
			"PYTHON_VERSION=3.11.12",
			"PYTHON_SHA256=849da87af4df137710c1796e276a955f7a85c9f971081067c8f565d15c352a09",
			"ENV=prod",
			"PORT=8080",
			"USE_OLLAMA_DOCKER=false",
			"USE_CUDA_DOCKER=false",
			"USE_CUDA_DOCKER_VER=cu128",
			"USE_EMBEDDING_MODEL_DOCKER=sentence-transformers/all-MiniLM-L6-v2",
			"USE_RERANKING_MODEL_DOCKER=",
			"OLLAMA_BASE_URL=/ollama",
			"OPENAI_API_BASE_URL=",
			"OPENAI_API_KEY=",
			"WEBUI_SECRET_KEY=",
			"SCARF_NO_ANALYTICS=true",
			"DO_NOT_TRACK=true",
			"ANONYMIZED_TELEMETRY=false",
			"WHISPER_MODEL=base",
			"WHISPER_MODEL_DIR=/app/backend/data/cache/whisper/models",
			"RAG_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2",
			"RAG_RERANKING_MODEL=",
			"SENTENCE_TRANSFORMERS_HOME=/app/backend/data/cache/embedding/models",
			"TIKTOKEN_ENCODING_NAME=cl100k_base",
			"TIKTOKEN_CACHE_DIR=/app/backend/data/cache/tiktoken",
			"HF_HOME=/app/backend/data/cache/embedding/models",
			"HOME=/root",
			"WEBUI_BUILD_VERSION=ba0088f39b7a093920b142a5172554686f24df60",
			"DOCKER=true"

Additional Information

Even with a completely fresh installation of open-webui, on either the main or dev branch, I get the same error; the only thing carried over is the previously used volume. I was previously able to list my Ollama models, but after attempting to fix the broken functionality, that stopped working too. I have been, and still am, very fond of open-webui and use it for almost everything, but something after upgrading to the latest image has broken this functionality. Keep up the good work, team, and let me know if anything more is required for debugging. Happy to help.

PS: I have gone through the troubleshooting page and tried updating the Ollama URL, using both localhost and 127.0.0.1, but nothing works. I can confirm that the local Ollama APIs themselves are working fine.
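One way to separate a container-networking problem from a missing endpoint is to probe Ollama from inside the container; a sketch, assuming the container is named `open-webui` (as in the reproduction command) and that an embedding-capable model such as `nomic-embed-text` has been pulled:

```shell
# From the Docker host: verify Ollama answers from the container's network view.
docker exec open-webui curl -s http://host.docker.internal:11434/api/version

# Probe the embeddings endpoint the traceback shows failing with a 404.
docker exec open-webui curl -s http://host.docker.internal:11434/api/embed \
  -d '{"model": "nomic-embed-text", "input": "connectivity test"}'
```

If the first command succeeds but the second returns a 404, the problem is the embeddings endpoint or model rather than the Docker networking.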

UPDATE:
After changing a few Docker configs, I was able to resolve the models-not-listing issue, but the main issue, where it throws the error specified above, still persists.

GiteaMirror added the bug label 2026-04-19 22:59:27 -05:00

@rgaricano commented on GitHub (May 29, 2025):

This is already addressed; same as https://github.com/open-webui/open-webui/issues/14421


@tjbck commented on GitHub (May 29, 2025):

Addressed with be989f3645f2994af95a1ae9e58993da067ddde4

Reference: github-starred/open-webui#17276