Unable to connect to Ollama #148

Closed
opened 2025-11-11 14:08:16 -06:00 by GiteaMirror · 19 comments
Owner

Originally created by @netphantom on GitHub (Jan 4, 2024).

Bug Report

Description

Bug Summary:
I am unable to connect the UI to the Ollama server, even though it is running. The UI returns:

Ollama Web UI
Ollama WebUI: Server Connection Error
Ollama Version: Not Detected

Connection Issue or Update Needed
Oops! It seems like your Ollama needs a little attention.
We've detected either a connection hiccup or observed that you're using an older version. Ensure you're on the latest Ollama version
(version 0.1.16 or higher) or check your connection.
Trouble accessing Ollama? Click here for help (https://github.com/ollama-webui/ollama-webui#troubleshooting).

Steps to Reproduce:
run on a terminal
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main

Then register with an email address.

Expected Behavior:
Login

Actual Behavior:
Connection to Ollama error

Environment

  • Operating System: Manjaro
  • Browser (if applicable): Firefox 121 - Chromium 120

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I have reviewed the troubleshooting.md document.
  • [ ] I have included the browser console logs.
  • [x] I have included the Docker container logs.

Logs and Screenshots

Docker Container Logs:

INFO:     Started server process [6]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
INFO:     172.17.0.1:34062 - "GET /auth HTTP/1.1" 304 Not Modified
INFO:     172.17.0.1:34062 - "GET / HTTP/1.1" 200 OK
(trapped) error reading bcrypt version
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/passlib/handlers/bcrypt.py", line 620, in _load_backend_mixin
    version = _bcrypt.__about__.__version__
              ^^^^^^^^^^^^^^^^^
AttributeError: module 'bcrypt' has no attribute '__about__'
insert_new_auth
INFO:     172.17.0.1:43244 - "POST /auths/signup HTTP/1.1" 200 OK
INFO:     172.17.0.1:43244 - "GET /modelfiles/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:43244 - "GET /prompts/ HTTP/1.1" 200 OK
http://host.docker.internal:11434/api/tags
http://host.docker.internal:11434/api/version
HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f01d0187090>: Failed to establish a new connection: [Errno 111] Connection refused'))
HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/version (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f01d01c3710>: Failed to establish a new connection: [Errno 111] Connection refused'))
INFO:     172.17.0.1:43244 - "GET /tags HTTP/1.1" 400 Bad Request
INFO:     172.17.0.1:43250 - "GET /version HTTP/1.1" 400 Bad Request
http://host.docker.internal:11434/api/version
INFO:     172.17.0.1:43250 - "GET /chats/ HTTP/1.1" 200 OK
HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/version (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f01d0315e90>: Failed to establish a new connection: [Errno 111] Connection refused'))
INFO:     172.17.0.1:43244 - "GET /version HTTP/1.1" 400 Bad Request
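As a side note, the "(trapped) error reading bcrypt version" entries in the log above come from passlib probing `bcrypt.__about__.__version__`, an attribute that newer bcrypt releases (4.1+) no longer provide; passlib traps the error, so it is noise rather than the cause of the connection failure. A minimal sketch of the probe with a more robust fallback (the `FakeBcrypt` stand-in is hypothetical, not from passlib):

```python
from importlib import metadata
from types import SimpleNamespace

# Stand-in for a modern bcrypt module that lacks the __about__ attribute.
FakeBcrypt = SimpleNamespace(name="bcrypt")

def read_bcrypt_version(mod) -> str:
    """Mimic passlib's version probe, falling back to package metadata."""
    try:
        return mod.__about__.__version__  # old bcrypt layout (pre-4.1)
    except AttributeError:
        try:
            return metadata.version("bcrypt")  # works for bcrypt >= 4.1
        except metadata.PackageNotFoundError:
            return "unknown"

print(read_bcrypt_version(FakeBcrypt))
```

The try/except mirrors what passlib does when it logs "(trapped)": the AttributeError is caught, and the backend still loads.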

Screenshots (if applicable):
Screenshot: https://github.com/ollama-webui/ollama-webui/assets/13152948/24dce1e9-afff-4d2c-8da0-ce574ea1b908

Installation Method

Docker (image downloaded)

Additional Information

  • Going to the settings page and changing the Ollama API endpoint doesn't fix the problem
  • Running the docker command with OLLAMA_API_BASE_URL set doesn't fix the problem
  • Changing the network to host doesn't fix the problem
  • The Ollama server is running at http://127.0.0.1:11434/ and responds to curl, and ollama run starts normally in the console
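One thing worth checking for the "Connection refused" on host.docker.internal: by default Ollama binds only to 127.0.0.1 on the host, so a container on Docker's default bridge network cannot reach it through the host gateway even though local curl works. A hedged sketch of the workaround (this assumes Ollama is started manually from a shell; a systemd service would need the environment variable set in its unit file instead):

```shell
# Make Ollama listen on all interfaces so host.docker.internal can reach it.
# (Assumption: ollama is launched from this shell, not managed by systemd.)
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```

This would explain why switching the container to --network=host (where 127.0.0.1 is shared) behaves differently from the bridge-network setup.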

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

Author
Owner

@tjbck commented on GitHub (Jan 4, 2024):

Hi! Just to confirm,

docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main

command didn't work for you after removing the old webui docker instance (docker rm -f ollama-webui)?

Author
Owner

@netphantom commented on GitHub (Jan 4, 2024):

Hi @tjbck, I just tried the command and the answer is no. I deleted the volumes, image, container, everything. After re-signing up, the error persists.

Please find the Docker logs attached:

INFO:     Started server process [7]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
INFO:     127.0.0.1:45330 - "GET / HTTP/1.1" 200 OK
INFO:     127.0.0.1:45330 - "GET /_app/immutable/entry/start.ca1bee60.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45338 - "GET /_app/immutable/chunks/scheduler.0545783f.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45346 - "GET /_app/immutable/chunks/singletons.4d2bb212.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45366 - "GET /_app/immutable/entry/app.611f056c.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45368 - "GET /_app/immutable/chunks/index.a5765a96.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45350 - "GET /_app/immutable/chunks/index.323ffc89.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45330 - "GET /_app/immutable/assets/constants.3a6d0da3.css HTTP/1.1" 200 OK
INFO:     127.0.0.1:45366 - "GET /_app/immutable/assets/0.7aad91b4.css HTTP/1.1" 200 OK
INFO:     127.0.0.1:45346 - "GET /_app/immutable/chunks/index.b8e43179.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45368 - "GET /_app/immutable/nodes/0.ec213209.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45338 - "GET /_app/immutable/chunks/navigation.31587a63.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45350 - "GET /_app/immutable/chunks/constants.2c7fe818.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45366 - "GET /_app/immutable/chunks/each.3e33e1b7.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45330 - "GET /_app/immutable/chunks/index.1e767456.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45368 - "GET /_app/immutable/nodes/2.318ce0b0.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45338 - "GET /_app/immutable/nodes/1.7e266ce5.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45346 - "GET /_app/immutable/chunks/stores.51ca8069.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45350 - "GET /_app/immutable/chunks/FileSaver.min.898eb36f.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45366 - "GET /_app/immutable/chunks/_commonjsHelpers.de833af9.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45330 - "GET /_app/immutable/chunks/index.93390239.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45338 - "GET /_app/immutable/chunks/index.fbad1410.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45346 - "GET /_app/immutable/chunks/index.81f274ff.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45350 - "GET /_app/immutable/chunks/index.6f7e0cd4.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45368 - "GET /_app/immutable/chunks/index.7fdf4701.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45366 - "GET /_app/immutable/chunks/Advanced.8e7ace2b.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45330 - "GET /_app/immutable/nodes/3.6188be87.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45338 - "GET /_app/immutable/assets/2.f48dc938.css HTTP/1.1" 200 OK
INFO:     127.0.0.1:45346 - "GET /_app/immutable/assets/Navbar.e3b04202.css HTTP/1.1" 200 OK
INFO:     127.0.0.1:45368 - "GET /favicon.png HTTP/1.1" 200 OK
INFO:     127.0.0.1:45368 - "GET /_app/immutable/chunks/Navbar.28918aaf.js HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:45366 - "GET /themes/rosepine-dawn.css HTTP/1.1" 200 OK
INFO:     127.0.0.1:45368 - "GET /themes/rosepine.css HTTP/1.1" 200 OK
INFO:     127.0.0.1:45368 - "GET / HTTP/1.1" 200 OK
INFO:     127.0.0.1:45368 - "GET /auths/ HTTP/1.1" 401 Unauthorized
INFO:     127.0.0.1:45368 - "GET /_app/immutable/nodes/12.d8473817.js HTTP/1.1" 200 OK
INFO:     127.0.0.1:45366 - "GET /_app/immutable/assets/12.e43bb62b.css HTTP/1.1" 200 OK
INFO:     127.0.0.1:45368 - "GET /assets/fonts/Arimo-Variable.ttf HTTP/1.1" 304 Not Modified
INFO:     127.0.0.1:45366 - "GET /assets/fonts/Mona-Sans.woff2 HTTP/1.1" 200 OK
INFO:     127.0.0.1:45368 - "GET /ollama.png HTTP/1.1" 200 OK
(trapped) error reading bcrypt version
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/passlib/handlers/bcrypt.py", line 620, in _load_backend_mixin
    version = _bcrypt.__about__.__version__
              ^^^^^^^^^^^^^^^^^
AttributeError: module 'bcrypt' has no attribute '__about__'
insert_new_auth
INFO:     127.0.0.1:60088 - "POST /auths/signup HTTP/1.1" 200 OK
INFO:     127.0.0.1:60088 - "GET /modelfiles/ HTTP/1.1" 200 OK
INFO:     127.0.0.1:60088 - "GET /prompts/ HTTP/1.1" 200 OK
INFO:     127.0.0.1:60098 - "GET /api//version HTTP/1.1" 200 OK
INFO:     127.0.0.1:60088 - "GET /api//tags HTTP/1.1" 200 OK
INFO:     127.0.0.1:60098 - "GET /chats/ HTTP/1.1" 200 OK
Author
Owner

@tjbck commented on GitHub (Jan 4, 2024):

Hmm, could you also share the browser console logs with us? Thanks!

Author
Owner

@tjbck commented on GitHub (Jan 4, 2024):

/api//tags

should've been

/tags

in the backend logs (screenshot: https://github.com/ollama-webui/ollama-webui/assets/25473318/54369a9a-fe8b-4a38-8a89-86312be85baf).

So please make sure that the Ollama API URL is set to /ollama/api in the WebUI settings as well!

Screenshot: https://github.com/ollama-webui/ollama-webui/assets/25473318/d7df279d-6c62-4828-b94f-7f9ef080ea2e
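For illustration, the doubled slash in /api//tags is what naive string concatenation produces when an API base ending in "/" meets a path beginning with "/". A hypothetical sketch (the helper names below are not from the WebUI code, just an illustration of the pitfall):

```python
def naive_join(base: str, path: str) -> str:
    # Plain concatenation doubles the slash when both sides contribute one.
    return base + path

def safe_join(base: str, path: str) -> str:
    # Normalize both sides so exactly one slash separates base and path.
    return base.rstrip("/") + "/" + path.lstrip("/")

print(naive_join("/api/", "/tags"))       # → /api//tags
print(safe_join("/ollama/api", "tags"))   # → /ollama/api/tags
```

Most servers tolerate the double slash (the /api//version request above even returned 200), so the real problem is the prefix itself: requests must go through the WebUI's /ollama/api proxy rather than straight to an /api path.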
Author
Owner

@netphantom commented on GitHub (Jan 4, 2024):

I thought for a moment that it worked, but I am getting this error on the backend:

INFO:     127.0.0.1:34362 - "GET /ollama.png HTTP/1.1" 200 OK
http://127.0.0.1:11434/ollama/api/version
404 Client Error: Not Found for url: http://127.0.0.1:11434/ollama/api/version
404 page not found
INFO:     127.0.0.1:34362 - "GET /chats/ HTTP/1.1" 200 OK
[2024-01-04 19:20:01,790] ERROR in app: Exception on /version [GET]
Traceback (most recent call last):
  File "/app/backend/apps/ollama/main.py", line 83, in proxy
    r.raise_for_status()
  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http://127.0.0.1:11434/ollama/api/version

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 971, in json
    return complexjson.loads(self.text, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 1 column 5 (char 4)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 1455, in wsgi_app
    response = self.full_dispatch_request()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 869, in full_dispatch_request
    rv = self.handle_user_exception(e)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/flask_cors/extension.py", line 176, in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
                                                ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 867, in full_dispatch_request
    rv = self.dispatch_request()
         ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 852, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/apps/ollama/main.py", line 102, in proxy
    res = r.json()
          ^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 975, in json
    raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Extra data: line 1 column 5 (char 4)
INFO:     127.0.0.1:34368 - "GET /version HTTP/1.1" 500 Internal Server Error
http://127.0.0.1:11434/ollama/api/tags
404 Client Error: Not Found for url: http://127.0.0.1:11434/ollama/api/tags
404 page not found
[2024-01-04 19:20:08,679] ERROR in app: Exception on /tags [GET]
Traceback (most recent call last):
  File "/app/backend/apps/ollama/main.py", line 83, in proxy
    r.raise_for_status()
  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http://127.0.0.1:11434/ollama/api/tags

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 971, in json
    return complexjson.loads(self.text, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 1 column 5 (char 4)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 1455, in wsgi_app
    response = self.full_dispatch_request()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 869, in full_dispatch_request
    rv = self.handle_user_exception(e)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/flask_cors/extension.py", line 176, in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
                                                ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 867, in full_dispatch_request
    rv = self.dispatch_request()
         ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 852, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/apps/ollama/main.py", line 102, in proxy
    res = r.json()
          ^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 975, in json
    raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)
requests.exceptions.JSONDecodeError: Extra data: line 1 column 5 (char 4)
INFO:     127.0.0.1:39236 - "GET /tags HTTP/1.1" 500 Internal Server Error
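The "Extra data: line 1 column 5 (char 4)" in the tracebacks above is exactly what `json.loads` raises on the plain-text body "404 page not found" that Ollama returns for unknown paths: it parses "404" as a number and then finds trailing text. A quick reproduction:

```python
import json

# Plain-text body Ollama returns when the request path does not exist.
body = "404 page not found"

try:
    json.loads(body)
except json.JSONDecodeError as exc:
    # json parses "404" as an integer, then chokes on "page..." at char 4.
    print(f"{exc.msg}: line {exc.lineno} column {exc.colno} (char {exc.pos})")
    # → Extra data: line 1 column 5 (char 4)
```

So the 500 errors are a downstream symptom: the proxy reached Ollama, but at a URL Ollama does not serve, and then failed to parse the text 404 page as JSON.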

In the Firefox console log I see this:

Object { status: true, version: "v1.0.0-alpha.42", auth: true, default_models: null }
[0.ec213209.js:1:18264](http://localhost:8080/_app/immutable/nodes/0.ec213209.js)
Object { detail: "Your session has expired or the token is invalid. Please sign in again." }
[index.1e767456.js:1:277](http://localhost:8080/_app/immutable/chunks/index.1e767456.js)
Object { id: "8f3ff642-1964-4b16-8484-bf82f94b0a58", email: "test@test.com", name: "Test", role: "admin", profile_image_url: "https://www.gravatar.com/avatar/f660ab912ec121d1b1e928a0bb4bc61b15f5ad44d5efdc4e1c92a25e99b8e44a?d=mp", token: "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJlbWFpbCI6InRlc3RAdGVzdC5jb20ifQ.Sl_ErLdM1pdagmu0N1uZly1zevHGEpc15l4RuSmyIC4", token_type: "Bearer" }
[12.d8473817.js:1:5503](http://localhost:8080/_app/immutable/nodes/12.d8473817.js)
IDB Not Found [2.318ce0b0.js:53:3042](http://localhost:8080/_app/immutable/nodes/2.318ce0b0.js)
[2.318ce0b0.js:53:3071](http://localhost:8080/_app/immutable/nodes/2.318ce0b0.js)
Array []
[2.318ce0b0.js:53:3236](http://localhost:8080/_app/immutable/nodes/2.318ce0b0.js)
SyntaxError: JSON.parse: unexpected character at line 1 column 1 of the JSON data [index.93390239.js:1:704](http://localhost:8080/_app/immutable/chunks/index.93390239.js)
SyntaxError: JSON.parse: unexpected character at line 1 column 1 of the JSON data [index.93390239.js:1:320](http://localhost:8080/_app/immutable/chunks/index.93390239.js)
<empty string> [2.318ce0b0.js:53:2668](http://localhost:8080/_app/immutable/nodes/2.318ce0b0.js)
Object { API_BASE_URL: "/ollama/api" }
[2.318ce0b0.js:37:1425](http://localhost:8080/_app/immutable/nodes/2.318ce0b0.js)
initNewChat [3.6188be87.js:1:5470](http://localhost:8080/_app/immutable/nodes/3.6188be87.js)
<empty string> [3.6188be87.js:1:5514](http://localhost:8080/_app/immutable/nodes/3.6188be87.js)
Object { status: true, version: "v1.0.0-alpha.42", auth: true, default_models: null }
[3.6188be87.js:1:5595](http://localhost:8080/_app/immutable/nodes/3.6188be87.js)
SyntaxError: JSON.parse: unexpected character at line 1 column 1 of the JSON data [index.93390239.js:1:320](http://localhost:8080/_app/immutable/chunks/index.93390239.js)

The Ollama server is indeed running; in fact, if I run

~  ollama serve
Error: listen tcp 127.0.0.1:11434: bind: address already in use
listen tcp 127.0.0.1:11434: bind: address already in use ```
@tjbck commented on GitHub (Jan 4, 2024):

Seems like you might've set your `OLLAMA_API_BASE_URL` env var to `http://127.0.0.1:11434/ollama/api`. Could you verify that it's been set to `http://127.0.0.1:11434/api`?
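To make the 404 in the logs concrete: the WebUI appends endpoint paths such as `/tags` to the configured base URL, so a base of `http://127.0.0.1:11434/ollama/api` produces a route Ollama does not serve. A minimal sketch, using a hypothetical `join_tags_url` helper (not the actual backend code):

```python
# Hypothetical helper mirroring how the WebUI appends endpoints to the
# configured OLLAMA_API_BASE_URL; illustrative only, not the real backend code.
def join_tags_url(base_url: str) -> str:
    """Build the model-listing endpoint from the configured base URL."""
    return base_url.rstrip("/") + "/tags"

# Misconfigured base URL: Ollama has no /ollama/api/tags route, hence the 404.
print(join_tags_url("http://127.0.0.1:11434/ollama/api"))
# Correct base URL for this release:
print(join_tags_url("http://127.0.0.1:11434/api"))
```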

@tjbck commented on GitHub (Jan 5, 2024):

If you update to the latest release, the settings should look something like this:

![image](https://github.com/ollama-webui/ollama-webui/assets/25473318/1263a19a-ff63-4502-b725-6d441f37f07f)
@netphantom commented on GitHub (Jan 5, 2024):

I think I got it working. I purged the Docker images and ran the new one with:

```bash
docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```

The new screen actually shows the correct base URL: `http://127.0.0.1:11434/api`

Thanks for the support! :)

@tjbck commented on GitHub (Jan 5, 2024):

Glad it's working for you now!

@maurimv commented on GitHub (May 1, 2024):

> I think I got it working. I purged the Docker images and ran the new one with: `docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main`
>
> The new screen actually shows the correct base URL: `http://127.0.0.1:11434/api`
>
> Thanks for the support! :)

Thanks, works for me!

@justinh-rahb commented on GitHub (May 1, 2024):

> > I think I got it working. I purged the Docker images and ran the new one with: `docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main`
> > The new screen actually shows the correct base URL: `http://127.0.0.1:11434/api`
> > Thanks for the support! :)
>
> Thanks, works for me!

That's old advice. The Ollama URL variable values should no longer have `/api` on the end; this may cause problems.
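For anyone migrating an old config, the legacy `/api` suffix can be stripped mechanically. A small sketch, assuming newer releases expect a bare `OLLAMA_BASE_URL` such as `http://127.0.0.1:11434` (`normalize_base_url` is a hypothetical helper, not part of open-webui):

```python
# Hypothetical migration helper: strip a legacy /api suffix left over from
# old ollama-webui OLLAMA_API_BASE_URL values.
def normalize_base_url(url: str) -> str:
    """Return the URL without a trailing slash or legacy /api suffix."""
    url = url.rstrip("/")
    if url.endswith("/api"):
        url = url[: -len("/api")]
    return url

print(normalize_base_url("http://127.0.0.1:11434/api"))  # http://127.0.0.1:11434
print(normalize_base_url("http://127.0.0.1:11434/"))     # http://127.0.0.1:11434
```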

@hemangjoshi37a commented on GitHub (May 6, 2024):

Hi, `http://127.0.0.1:11434/api` is working in ollama-webui but not working in open-webui.

@ahmetcanisik commented on GitHub (Jun 5, 2024):

When I tried `http://localhost:8080`, the WebUI finally saw my Ollama models.

@ccarrez commented on GitHub (Aug 14, 2024):

[netphantom](https://github.com/netphantom) found a solution that shows the problem is related to Docker networking.

It works by using the **host** network. However, in this case you cannot use port redirection (`-p 3000:8080`):

```bash
sudo docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

Open WebUI is still not working as intended, cf. the README installation guide.

@laurentperez commented on GitHub (Aug 30, 2024):

Sorry to bump this closed issue, but @ccarrez is right: running the container on the host network will prevent port forwarding.

When you already run local containers on port 8080, this is not possible.

EDIT: you need to run open-webui directly, without port redirection:

```bash
docker run --network=host -e WEBUI_AUTH=false -e PORT=3000 -e OLLAMA_BASE_URL=http://xxxx
```

@hemangjoshi37a commented on GitHub (Aug 31, 2024):

Now it is not even starting, and the Docker container logs show this error:

    |     await super().__call__(scope, receive, send)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    |     await self.middleware_stack(scope, receive, send)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    |     raise exc
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    |     await self.app(scope, receive, _send)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    |     with collapse_excgroups():
    |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    |     self.gen.throw(typ, value, traceback)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    |     raise exc
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    |     response = await self.dispatch_func(request, call_next)
    |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/app/backend/main.py", line 790, in update_embedding_function
    |     response = await call_next(request)
    |                ^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next
    |     raise app_exc
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
    |     await self.app(scope, receive_or_disconnect, send_no_error)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    |     with collapse_excgroups():
    |   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    |     self.gen.throw(typ, value, traceback)
    |   File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    |     raise exc
    |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    |     response = await self.dispatch_func(request, call_next)
    |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/app/backend/main.py", line 776, in check_url
    |     await get_all_models()
    |   File "/app/backend/main.py", line 823, in get_all_models
    |     ollama_models = await get_ollama_models()
    |                     ^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/app/backend/apps/ollama/main.py", line 235, in get_all_models
    |     app.state.MODELS = {model["model"]: model for model in models["models"]}
    |                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/app/backend/apps/ollama/main.py", line 235, in <dictcomp>
    |     app.state.MODELS = {model["model"]: model for model in models["models"]}
    |                         ~~~~~^^^^^^^^^
    | KeyError: 'model'
    +------------------------------------
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 401, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    with collapse_excgroups():
  File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/main.py", line 790, in update_embedding_function
    response = await call_next(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next
    raise app_exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    with collapse_excgroups():
  File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/main.py", line 776, in check_url
    await get_all_models()
  File "/app/backend/main.py", line 823, in get_all_models
    ollama_models = await get_ollama_models()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/apps/ollama/main.py", line 235, in get_all_models
    app.state.MODELS = {model["model"]: model for model in models["models"]}
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/apps/ollama/main.py", line 235, in <dictcomp>
    app.state.MODELS = {model["model"]: model for model in models["models"]}
                        ~~~~~^^^^^^^^^
KeyError: 'model'
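The `KeyError: 'model'` above suggests the `/api/tags` response contained entries without a `model` field (older Ollama builds returned `name` instead). A defensive sketch of the failing dict comprehension, illustrative only and not the actual open-webui fix:

```python
# Illustrative, defensive rewrite of the dict comprehension that crashes in
# apps/ollama/main.py: fall back to "name" when "model" is missing, and skip
# entries that have neither, instead of raising KeyError.
def index_models(models: list) -> dict:
    indexed = {}
    for model in models:
        key = model.get("model") or model.get("name")
        if key is not None:
            indexed[key] = model
    return indexed

print(index_models([{"model": "llama2:latest"}, {"name": "mistral:7b"}, {}]))
```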
@ccarrez commented on GitHub (Sep 1, 2024):

This is working fine with Docker Compose. See the following docker-compose.yml example:

```yaml
# Ollama
services:
  webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - 3000:8080/tcp
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - /var/opt/data/ollama/webui:/app/backend/data
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    ports:
      - 11434:11434/tcp
    volumes:
      - /var/opt/data/ollama/ollama:/root/.ollama
    devices:
      # - /dev/kfd
      - /dev/dri
```
@hemangjoshi37a commented on GitHub (Sep 2, 2024):

OK, this is good, but why hasn't this been fixed natively in the source code after such a long time? So many people have had this problem for so long.

@atljoseph commented on GitHub (Oct 2, 2024):

I installed this via docker last month. Now, it won’t connect to Ollama. Ollama install hasn’t changed at all. Why is this still a problem? Gonna go spend my time on something else, now. A different repo.


Reference: github-starred/open-webui#148