[GH-ISSUE #802] open-webui doesn't detect ollama #12222

Closed
opened 2026-04-19 19:06:09 -05:00 by GiteaMirror · 11 comments
Owner

Originally created by @mira-roza on GitHub (Feb 19, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/802

Bug Report

Description

Bug Summary:
open-webui doesn't detect ollama

Steps to Reproduce:

  • Install Ollama and check that it is running.
  • Install open-webui with Docker: docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  • Go to the webpage and log in.
  • You see the error.

Expected Behavior:
Open WebUI detects the local Ollama instance and is usable.

Actual Behavior:
On the webpage I see:

Connection Issue or Update Needed
Oops! It seems like your Ollama needs a little attention.
We've detected either a connection hiccup or observed that you're using an older version. Ensure you're on the latest Ollama version
(version 0.1.16 or higher) or check your connection.
Trouble accessing Ollama? Click here for help.

Environment

  • Operating System: Ubuntu 22.04.3 LTS
  • Browser (if applicable): Firefox 122.0.1

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I have reviewed the troubleshooting.md document.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:

No WEBUI_SECRET_KEY provided
Generating WEBUI_SECRET_KEY
Loading WEBUI_SECRET_KEY from .webui_secret_key
INFO:     Started server process [1]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
INFO:     172.17.0.1:50848 - "GET / HTTP/1.1" 200 OK
INFO:     172.17.0.1:50848 - "GET /api/v1/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:50848 - "GET /api/v1/auths/ HTTP/1.1" 401 Unauthorized
INFO:     172.17.0.1:50848 - "GET /_app/immutable/nodes/13.2fb87e10.js HTTP/1.1" 304 Not Modified
INFO:     172.17.0.1:50858 - "GET /_app/immutable/assets/13.e43bb62b.css HTTP/1.1" 304 Not Modified
authenticate_user test@test.com
INFO:     172.17.0.1:34636 - "POST /api/v1/auths/signin HTTP/1.1" 200 OK
INFO:     172.17.0.1:34636 - "GET /api/v1/modelfiles/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:34636 - "GET /api/v1/prompts/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:34636 - "GET /api/v1/documents/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:34636 - "GET /api/v1/chats/tags/all HTTP/1.1" 200 OK
INFO:     172.17.0.1:34636 - "GET /ollama/api/tags HTTP/1.1" 500 Internal Server Error
INFO:     172.17.0.1:34648 - "GET /ollama/api/version HTTP/1.1" 500 Internal Server Error
https://api.openai.com/v1/models 
INFO:     172.17.0.1:34648 - "GET /openai/api/models HTTP/1.1" 401 Unauthorized
INFO:     172.17.0.1:34648 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:41882 - "GET /api/v1/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:41882 - "GET /api/v1/auths/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:41882 - "GET /api/v1/modelfiles/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:41882 - "GET /api/v1/prompts/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:41882 - "GET /api/v1/documents/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:41882 - "GET /api/v1/chats/tags/all HTTP/1.1" 200 OK
INFO:     172.17.0.1:41882 - "GET /ollama/api/tags HTTP/1.1" 500 Internal Server Error
INFO:     172.17.0.1:41886 - "GET /ollama/api/version HTTP/1.1" 500 Internal Server Error
https://api.openai.com/v1/models 
INFO:     172.17.0.1:41886 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
INFO:     172.17.0.1:41882 - "GET /openai/api/models HTTP/1.1" 401 Unauthorized
INFO:     172.17.0.1:59096 - "GET / HTTP/1.1" 200 OK
INFO:     172.17.0.1:59110 - "GET /_app/immutable/chunks/navigation.388414fc.js HTTP/1.1" 200 OK
INFO:     172.17.0.1:59106 - "GET /_app/immutable/chunks/constants.72278eeb.js HTTP/1.1" 200 OK

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

With Docker, I tried:

  • docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  • docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main (same result)
  • docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main, with which http://localhost:3000 shows:

Open WebUI Backend Required
Oops! You're using an unsupported method (frontend only). Please serve the WebUI from the backend.
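
Before tweaking container flags further, a quick host-side pre-flight can rule out Ollama itself. This is an editorial sketch, not from the thread; the default port 11434 and the /api/version endpoint are assumptions based on a stock Ollama install:

```shell
#!/bin/sh
# Pre-flight: is Ollama answering on the host at all?
# The default port 11434 and the /api/version endpoint are assumptions
# based on a stock Ollama install; adjust OLLAMA_URL if yours differs.
OLLAMA_URL="${OLLAMA_URL:-http://127.0.0.1:11434}"

if curl -fsS "$OLLAMA_URL/api/version" >/dev/null 2>&1; then
  echo "ollama reachable at $OLLAMA_URL"
else
  echo "ollama unreachable at $OLLAMA_URL"
fi
```

If this prints "unreachable", no Docker networking flag will help until Ollama itself is listening.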

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!


@justinh-rahb commented on GitHub (Feb 19, 2024):

Read the Ollama instructions for setting environment variables on Linux and then change your OLLAMA_API_BASE_URL in the docker run command to host.docker.internal


@mira-roza commented on GitHub (Feb 19, 2024):

I still have the problem. I modified the systemd unit to:

### Editing /etc/systemd/system/ollama.service.d/override.conf
### Anything between here and the comment below will become the new contents of the file

[Service]
Environment="OLLAMA_HOST=0.0.0.0"

### Lines below this comment will be discarded

### /etc/systemd/system/ollama.service
# [Unit]
# Description=Ollama Service
# After=network-online.target
# 
# [Service]
# ExecStart=/usr/local/bin/ollama serve
# User=ollama
# Group=ollama
# Restart=always
# RestartSec=3
# Environment="PATH=/usr/local/cuda-12.3/bin:/home/poclain/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/b>
# 
# [Install]
# WantedBy=default.target

I ran:

systemctl daemon-reload
systemctl restart ollama

After that: docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
I opened http://127.0.0.1:3000/ and I still get the error: Connection Issue or Update Needed
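
To confirm the systemd override above actually took effect, it can help to check the unit's environment and the listening address. A sketch assuming systemd and the iproute2 ss tool; it skips gracefully where either is missing:

```shell
#!/bin/sh
# Show the environment systemd actually applied to the ollama unit,
# if systemctl is available on this machine.
if command -v systemctl >/dev/null 2>&1; then
  systemctl show ollama --property=Environment
fi

# Check that something listens on 11434 on all interfaces (0.0.0.0 or *),
# not only 127.0.0.1, if the ss tool is available.
if command -v ss >/dev/null 2>&1; then
  ss -tln | grep 11434 || echo "nothing listening on 11434"
fi
```

If the listener is still 127.0.0.1:11434, host.docker.internal from inside the container cannot reach it.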


@justinh-rahb commented on GitHub (Feb 19, 2024):

Set the environment variable for the Ollama host in your docker run command:

docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://host.docker.internal:11434/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

You may also need Environment="OLLAMA_ORIGINS=*" in your systemd override.


@mira-roza commented on GitHub (Feb 19, 2024):

I have the same result. What information would help you find where the problem is?


@justinh-rahb commented on GitHub (Feb 19, 2024):

> I have the same result. What information would help you find where the problem is?

Try adding this into the docker run command:

--add-host=host.docker.internal:host-gateway

And be sure you're removing the previous containers you've launched.

If it's still not working, we should probably be certain that your Ollama is reachable:

curl http://your_ip_address_not_localhost:11434/api

The result should say "OK". If this can't be done then Ollama is not properly using the environment variable you set.
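
Since the 500 errors come from the backend proxying to Ollama, the decisive check is reachability from inside the container rather than from the host. A hedged sketch, assuming the container is named open-webui as above and that the image ships a shell and curl (if curl is missing from the image, the probe would need another HTTP client):

```shell
#!/bin/sh
# Probe Ollama from the backend's own vantage point: inside the container.
# Container name open-webui and the host.docker.internal alias are taken
# from the docker run commands above.
if command -v docker >/dev/null 2>&1; then
  docker exec open-webui sh -c \
    'curl -fsS http://host.docker.internal:11434/api/version || echo unreachable-from-container'
else
  echo "docker not available"
fi
```

A host-side curl succeeding while this fails points at container-to-host networking, not at Ollama.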


@mira-roza commented on GitHub (Feb 19, 2024):

I used the docker run you provided, and I set the env vars in /etc/systemd/system/ollama.service.d/override.conf. The env vars I set are OLLAMA_HOST=0.0.0.0 and OLLAMA_ORIGINS=*. My Ollama runs on the same device as open-webui, and it works: for example, curl http://localhost:11434/api/version returns {"version":"0.1.25"}. I always remove the previous container, as otherwise I can't create a new one. I used: docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e OLLAMA_API_BASE_URL=http://host.docker.internal:11434/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main and it still doesn't work.


@justinh-rahb commented on GitHub (Feb 19, 2024):

Let's try an alternative approach from here:

docker run -d --network=host -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

You'll access the WebUI from http://localhost:8080 instead now.


@mira-roza commented on GitHub (Feb 19, 2024):

When I do that I get "Unable to connect". I think it's because:

docker container ls --format "table {{.ID}}\t{{.Names}}\t{{.Ports}}" -a
CONTAINER ID   NAMES        PORTS
cd363a564549   open-webui 

@justinh-rahb commented on GitHub (Feb 19, 2024):

When we use host networking we don't need to open the ports. What about http://127.0.0.1:8080?

Is it possible you have something else that already claimed port 8080?


@mira-roza commented on GitHub (Feb 19, 2024):

I don't think so, and I don't see 8080 when I run ss. I tried http://127.0.0.1:8080/ and I get "Unable to connect".
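
When host networking is used and nothing is listening on 8080, the container may have exited at startup; its logs usually say why. A sketch using the container name from the thread, skipping gracefully if docker is absent:

```shell
#!/bin/sh
# Check whether the container is still running and what it logged at startup.
if command -v docker >/dev/null 2>&1; then
  docker ps -a --filter name=open-webui --format '{{.Names}} {{.Status}}'
  docker logs --tail 50 open-webui
else
  echo "docker not available"
fi
```

An "Exited (1)" status here would shift the investigation from networking to the backend's startup errors.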


@justinh-rahb commented on GitHub (Feb 19, 2024):

Well, now I'm unsure. Maybe this will need fresh eyes to take a look.

Reference: github-starred/open-webui#12222