issue with open-webui not listing the models of ollama #1690

Closed
opened 2025-11-11 14:50:12 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @ds47x on GitHub (Aug 5, 2024).

Bug Report

When trying to select a model, the drop-down menu says "no results found".

Description

I can't select or find the Llama models in the WebUI.

I checked whether Ollama is running:

```
curl http://127.0.0.1:11434/
Ollama is running
```

```
ollama list
NAME                            ID              SIZE    MODIFIED
llama3.1:latest                 62757860e01    4.7 GB  2 hours ago
llama2-uncensored:latest        44040b92233    3.8 GB  2 months ago
```
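For reference, the same check can be made against Ollama's HTTP API (a diagnostic sketch, not part of the original report; `/api/tags` is Ollama's documented endpoint for listing local models, and is presumably what the WebUI's model dropdown is populated from):

```
# List local models over the HTTP API, from the host where Ollama runs.
curl http://127.0.0.1:11434/api/tags
```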

Ollama works in the terminal but isn't detected by the WebUI.

Steps to Reproduce:

1. Install Ollama:
   `curl -fsSL https://ollama.com/install.sh | sh`
2. Install any model:
   `ollama run llama3.1`
3. Run the WebUI with Docker:
   `sudo docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main`

After that, the models are not detected, and the WebUI reports a server error when trying to download models through it:

```
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
```
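The `Connection refused` on `host.docker.internal:11434` means the container resolves the host address but nothing accepts the connection there. A minimal diagnostic sketch (not from the original report; it assumes the container is named `open-webui` as in the `docker run` above, and that the install script set Ollama up as a systemd service named `ollama`):

```
# 1) Check from inside the container whether the host's Ollama port is reachable.
#    The image ships Python 3.11 (see the traceback in the logs below), so urllib is available:
sudo docker exec -it open-webui python3 -c "import urllib.request; print(urllib.request.urlopen('http://host.docker.internal:11434/').read().decode())"

# 2) If that fails while the curl from the host succeeds, Ollama is likely
#    listening only on 127.0.0.1 and refusing connections arriving via the
#    Docker bridge. Binding it to all interfaces would look roughly like:
sudo systemctl edit ollama
#    ...then add to the override file:
#      [Service]
#      Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama
```

Whether that is the cause here can't be confirmed from the report alone, but it would explain why `curl` on the host succeeds while the container's requests are refused.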

Expected Behavior:
I expected to be able to run the WebUI and use the local Llama models in it.

Actual Behavior:
The models are not listed in the WebUI.

Environment

  • **Open WebUI Version:** v0.3.11
  • **Ollama (if applicable):** 0.1.39
  • **Operating System:** EndeavourOS
  • **Browser (if applicable):** Firefox 128.0.3

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.
  • [x] I have included the Docker container logs.

Logs and Screenshots

![ol](https://github.com/user-attachments/assets/fa9a457a-167f-40f6-85a9-92bc5e9f6c7c)
![oll](https://github.com/user-attachments/assets/9e372ca3-5c34-4991-8c41-2ea83ba155c4)

Docker Container Logs:

```
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
ERROR:    Exception in ASGI application
  + Exception Group Traceback (most recent call last):
  |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 87, in collapse_excgroups
  |     yield
  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 190, in __call__
  |     async with anyio.create_task_group() as task_group:
  |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 680, in __aexit__
  |     raise BaseExceptionGroup(
  | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 435, in run_asgi
    |     result = await app(  # type: ignore[func-returns-value]
    |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    |     return await self.app(scope, receive, send)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    |     await super().__call__(scope, receive, send)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    |     await self.middleware_stack(scope, receive, send)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    |     raise exc
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    |     await self.app(scope, receive, _send)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    |     with collapse_excgroups():
    |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    |     self.gen.throw(typ, value, traceback)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    |     raise exc
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    |     response = await self.dispatch_func(request, call_next)
    |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/app/backend/main.py", line 902, in update_embedding_function
    |     response = await call_next(request)
    |                ^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 166, in call_next
    |     raise RuntimeError("No response returned.")
    | RuntimeError: No response returned.
    +------------------------------------

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 435, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    with collapse_excgroups():
  File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/main.py", line 902, in update_embedding_function
    response = await call_next(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 166, in call_next
    raise RuntimeError("No response returned.")
RuntimeError: No response returned.
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO:     172.17.0.1:52882 - "GET /_app/immutable/nodes/7.b46cc32c.js HTTP/1.1" 200 OK
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO:     172.17.0.1:52882 - "GET /api/v1/users/ HTTP/1.1" 200 OK
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO:     172.17.0.1:52894 - "GET /api/webhook HTTP/1.1" 200 OK
INFO:     172.17.0.1:52882 - "GET /api/v1/auths/admin/config HTTP/1.1" 200 OK
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO:     172.17.0.1:52882 - "GET /ollama/config HTTP/1.1" 200 OK
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO  [apps.ollama.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO:     172.17.0.1:52894 - "GET /ollama/api/version HTTP/1.1" 500 Internal Server Error
INFO:     172.17.0.1:52882 - "GET /ollama/urls HTTP/1.1" 200 OK
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO  [apps.ollama.main] url: http://host.docker.internal:11434
INFO:     172.17.0.1:52882 - "POST /ollama/api/pull/0 HTTP/1.1" 500 Internal Server Error
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
INFO  [apps.openai.main] get_all_models()
INFO  [apps.ollama.main] get_all_models()
ERROR [apps.ollama.main] Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection refused]
```

Installation Method

Docker

