[GH-ISSUE #209] Keep getting server connection failed when starting ollama-webui #27505
Originally created by @zono50 on GitHub (Dec 13, 2023).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/209
I run the Docker command:
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
It starts, and I can go to localhost:3000 and it pulls up, but as soon as it does I get a dropdown at the top saying "Server connection failed" and it won't let me do anything.
If I go to localhost:11434 it shows Ollama is running, and I can run Ollama in the terminal, but I can't do anything in ollama-webui.
@tjbck commented on GitHub (Dec 13, 2023):
Hi, could you please provide us with both browser console logs and Docker container logs? We cannot diagnose your issue without them. Thanks.
@zono50 commented on GitHub (Dec 13, 2023):
Failed to load resource: the server responded with a status of 500 (Internal Server Error)
SyntaxError: Unexpected token '<', "<!doctype "... is not valid JSON
The command I run to start the program is:
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 203, in _new_conn
sock = connection.create_connection(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/urllib3/util/connection.py", line 85, in create_connection
raise err
File "/usr/local/lib/python3.11/site-packages/urllib3/util/connection.py", line 73, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 790, in urlopen
response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 496, in _make_request
conn.request(
File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 395, in request
self.endheaders()
File "/usr/local/lib/python3.11/http/client.py", line 1281, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/usr/local/lib/python3.11/http/client.py", line 1041, in _send_output
self.send(msg)
File "/usr/local/lib/python3.11/http/client.py", line 979, in send
self.connect()
File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 243, in connect
self.sock = self._new_conn()
^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 218, in _new_conn
raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fe3a0425610>: Failed to establish a new connection: [Errno 111] Connection refused
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 486, in send
resp = conn.urlopen(
^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 844, in urlopen
retries = retries.increment(
^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/urllib3/util/retry.py", line 515, in increment
raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fe3a0425610>: Failed to establish a new connection: [Errno 111] Connection refused'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 1455, in wsgi_app
response = self.full_dispatch_request()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 869, in full_dispatch_request
rv = self.handle_user_exception(e)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/flask_cors/extension.py", line 176, in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 867, in full_dispatch_request
rv = self.dispatch_request()
^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/flask/app.py", line 852, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/backend/apps/ollama/main.py", line 63, in proxy
target_response = requests.request(
^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/requests/api.py", line 59, in request
return session.request(method=method, url=url, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 519, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fe3a0425610>: Failed to establish a new connection: [Errno 111] Connection refused'))
INFO: 172.17.0.1:56854 - "GET /tags HTTP/1.1" 500 Internal Server Error
@tjbck commented on GitHub (Dec 14, 2023):
It seems like the Ollama instance is not reachable from the Docker container. Could you tell us more about the system you're running Ollama / Ollama WebUI on, as well as how you installed Docker on your machine?
@zono50 commented on GitHub (Dec 14, 2023):
I installed Ollama originally and it worked fine, but then I installed a program called Quivr, which I think was also trying to run on port 3000. I ended up removing that program and reinstalling this one. I'm using Manjaro Linux.
@tjbck commented on GitHub (Dec 14, 2023):
Hmm, strange. Could you try to curl Ollama from inside the webui container and see if it has access to it? Also, just to make sure: both Ollama and Ollama WebUI are running on the same machine, correct? Thanks.
@zono50 commented on GitHub (Dec 14, 2023):
Yes, if I go to localhost:11434 it says Ollama is running, and I can run it from my terminal. How do I curl Ollama from inside the webui container? I even reinstalled Ollama and set open rules in my firewall, and it's the same issue.
@tjbck commented on GitHub (Dec 14, 2023):
Refer to here: https://stackoverflow.com/questions/30172605/how-do-i-get-into-a-docker-containers-shell
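For readers following along, a minimal sketch of that check (assuming the container is named ollama-webui and that curl is present in the image, which some builds apparently lack, as noted below):

```sh
# Open a shell inside the WebUI container
docker exec -it ollama-webui /bin/bash

# Then, from inside the container, try to reach Ollama on the host.
# host.docker.internal only resolves here because the container was started
# with --add-host=host.docker.internal:host-gateway:
curl http://host.docker.internal:11434/api/tags
```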
@ElieFrancis1 commented on GitHub (Dec 14, 2023):
Hello, I think I have the same problem:
In the browser, when I open http://myserver.com:3000, I get the message "Server connection failed".
@Clovis-krz commented on GitHub (Dec 14, 2023):
Same for me; OLLAMA_API_BASE_URL doesn't seem to work when starting the Docker container. I ended up overwriting the URL in the settings on the webui, but I have to do that for every computer.
@sartilas commented on GitHub (Dec 14, 2023):
Same for me, and in the UI I have: "Uh-oh! There was an issue connecting to Ollama."
Curling Ollama from inside the webui container:
"root@41938ad91712:/app/backend# curl http://localhost:11434/
curl: (7) Failed to connect to localhost port 11434: Connection refused"
But on the host: "Ollama is running"
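That result is actually expected: inside a container, localhost refers to the container's own loopback interface, not the host's. A hedged sketch of the distinction (assuming the container was started with the --add-host=host.docker.internal:host-gateway flag from the command at the top of this thread):

```sh
# Inside the container, this targets the container's own loopback and fails:
curl http://localhost:11434/              # curl: (7) Connection refused

# The host's Ollama must be addressed via the host-gateway alias instead:
curl http://host.docker.internal:11434/   # expected: "Ollama is running"
```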
@zono50 commented on GitHub (Dec 14, 2023):
I can do the docker exec command, but when I get into the container using /bin/bash, it doesn't recognize any programs like curl, pacman, wget, or ping. Nothing.
@zono50 commented on GitHub (Dec 14, 2023):
Got this information from Clovis:

```yaml
version: '3.6'

services:
  ollama-webui:
    build:
      context: .
      args:
        OLLAMA_API_BASE_URL: '/ollama/api'
      dockerfile: Dockerfile
    image: ollama-webui:latest
    container_name: ollama-webui
    environment:
      - "OLLAMA_API_BASE_URL=http://localhost:11434/api"
      # Uncomment below for WIP: Auth support
      # - "WEBUI_AUTH=TRUE"
      # - "WEBUI_DB_URL=mongodb://root:example@ollama-webui-db:27017/"
      # - "WEBUI_JWT_SECRET_KEY=SECRET_KEY"
    restart: unless-stopped
    network_mode: "host"

volumes:
  ollama: {}
```
I did notice with this setup that the option to download and use model files disappears when you change the port.
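An editorial aside on this compose file: network_mode: "host" makes the container share the host's network namespace (on Linux), which is why OLLAMA_API_BASE_URL can point at http://localhost:11434/api here, and why any ports: mappings would be ignored. A quick sanity check, assuming the container name above and that curl is present in the image:

```sh
# With network_mode: "host", localhost inside the container IS the host,
# so Ollama's default bind on 127.0.0.1:11434 is directly reachable:
docker exec -it ollama-webui curl http://localhost:11434/api/tags
```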
@tjbck commented on GitHub (Dec 14, 2023):
Hi @efrancis59, it looks like you might've made a mistake setting the OLLAMA_API_BASE_URL env var; the URL should include /api at the end (e.g. OLLAMA_API_BASE_URL=http://myserver.com:11434/api).
@tjbck commented on GitHub (Dec 14, 2023):
@Clovis-krz Could you also provide us with which command you used to install, as well as the ollama setup you have? Thanks.
@tjbck commented on GitHub (Dec 14, 2023):
@sartilas, Please refer to here: https://stackoverflow.com/questions/24319662/from-inside-of-a-docker-container-how-do-i-connect-to-the-localhost-of-the-mach
Also, if you could provide us with the setup you have, it would help us tremendously.
@tjbck commented on GitHub (Dec 14, 2023):
Here are some potential solutions/relevant issues I found by googling:
https://stackoverflow.com/questions/75237114/max-retries-exceeded-with-url-failed-to-establish-a-new-connection-errno-111
https://forums.docker.com/t/genai-stack-connection-error-with-host-docker-internal-port-11434/137993/2
https://forums.docker.com/t/connection-refused-on-host-docker-internal/136925/2
https://forums.docker.com/t/host-docker-internal-in-production-environment/137507/3
https://github.com/PrefectHQ/prefect/issues/4963
@zono50 commented on GitHub (Dec 14, 2023):
I previously updated my version of Ollama but that still didn't fix the issue, so I stopped the process on 11434 and Ollama, removed ollama-webui, did a fresh git clone, and ran docker compose up -d --build. It fixed my issue, and I am back on port 3000 enjoying the good life.
@iamyb commented on GitHub (Dec 15, 2023):
Same here. The API request to http://localhost:3000/ollama/api/tags failed with Bad Request reported; the API request to http://localhost:3000/api/v1/ is OK.
@iamyb commented on GitHub (Dec 15, 2023):
Worked it out with the below:
docker run -d --network=host -e OLLAMA_API_BASE_URL=http://localhost:11434/api --name ollama-webui --restart always ollama-webui
@djmaze commented on GitHub (Dec 15, 2023):
AFAICS, setting the OLLAMA_API_BASE_URL env var on the prebuilt Docker image cannot work, because the default URL seems to be baked into the frontend HTML during the build process. The way the build currently works here, you have to build your own Docker image with --build-arg OLLAMA_API_BASE_URL=http://your-server.com:11434/api and then use that. This is a design fault and needs to be fixed.
@tjbck commented on GitHub (Dec 15, 2023):
Hi @djmaze, FYI it's not a design fault and it's working as it should. By registering the OLLAMA_API_BASE_URL env var in the Docker container, you essentially create a backend reverse-proxy link, redirecting the hardcoded [your webui url]/ollama/api route to [your ollama url]/api. Please refer to here: https://github.com/ollama-webui/ollama-webui#project-components. Thanks.
@djmaze commented on GitHub (Dec 16, 2023):
Oh, well, sorry, then I got that wrong.
As I had the same error message in the frontend, here is a snippet of my docker logs with the original setup. I see the following error:
Not sure why this happens. Maybe there is a problem with HTTPS URLs (my Ollama is available via HTTPS)?
@tjbck commented on GitHub (Dec 16, 2023):
@djmaze Could you provide us with the commands you used to install? Also, is Ollama running on the same server as Ollama WebUI?
@djmaze commented on GitHub (Dec 17, 2023):
These are different Docker containers on the same machine. For Ollama WebUI, I used the original Docker image and command from the documentation (I replaced my domain with example.com in the snippets).
I just realized the logs now don't show the error anymore, only this:
When calling https://webui.example.com/api/tags manually, the page shows the following error message:
And I don't see any access attempts in the Ollama server log, so the proxy connection seems not to be established at all.
(Getting the URL via curl from within the webui container works, btw.)
@tjbck commented on GitHub (Dec 19, 2023):
@djmaze If you can join our Discord server and send me a PM with your Ollama URL, I'll personally take a look. Thanks.
Please check our TROUBLESHOOTING.md: https://github.com/ollama-webui/ollama-webui/blob/main/TROUBLESHOOTING.md
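To make the reverse-proxy behavior tjbck describes above concrete, a hedged sketch of what a correctly wired setup should return (ports and host assumed from the commands in this thread):

```sh
# The WebUI backend proxies its /ollama/api route to the configured Ollama
# server, so these two requests should return the same model-list JSON:
curl http://localhost:3000/ollama/api/tags   # via the WebUI reverse proxy
curl http://localhost:11434/api/tags         # direct to Ollama
```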
@m-hoseyny commented on GitHub (Jan 29, 2024):
Hello,
I just joined the Discord server and asked about the problem. However, I didn't get any response. How can I solve this problem?
@themw123 commented on GitHub (Feb 8, 2024):
Same for me. Docker logs show:
INFO: 100.79.239.85:1866 - "GET /ollama/api/version HTTP/1.1" 500 Internal Server Error
And in the browser console I see it's not able to call the version and tags endpoints (500).
I am using the newest versions of ollama and ollama-webui.
my compose:
Edit: Very interesting, with docker run it's now working:
docker run -d --network=host -v absolute_path_to/ollama/webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
Edit: Now I am getting 500: Internal Error when clicking on the admin panel.
@welchste commented on GitHub (Mar 8, 2024):
I keep getting this as well. Not using any containers; everything was manually installed.
@targinosilveira commented on GitHub (Mar 9, 2024):
I am getting the same problem. Ollama and Open WebUI were installed manually in a VM. When I run Ollama directly or through a CLI client I have no problems, only when I try via Open WebUI:
INFO: 127.0.0.1:47270 - "POST /ollama/api/chat HTTP/1.1" 500 Internal Server Error
http://localhost:11434
INFO: 127.0.0.1:47270 - "POST /ollama/api/generate HTTP/1.1" 500 Internal Server Error
@justinh-rahb commented on GitHub (Mar 9, 2024):
@welchste @targinosilveira
If you've installed directly, check your .env file against the .env.example file and be sure you've updated your variable names: OLLAMA_API_BASE_URL -> OLLAMA_BASE_URL.
@welchste commented on GitHub (Mar 9, 2024):
Hey, I tried that. Then I actually went the Docker route, with no luck. Then I used the localhost option with Docker, and still no luck. I then installed Ollama on a remote server, with no luck.
Finally I tried a different front-end project, and that works. So at least I ruled out any firewall or Ollama-specific issues.
That other project is a placeholder, so I look forward to being able to use Open-WebUI in the future.
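On the variable rename justinh-rahb points out above, a minimal .env sketch (values are placeholders; note that the newer variable takes the bare host URL without the /api suffix, matching the working value reported later in this thread):

```sh
# .env (sketch) — older installs used:
# OLLAMA_API_BASE_URL=http://localhost:11434/api
# newer releases expect:
OLLAMA_BASE_URL=http://localhost:11434
```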
@justinh-rahb commented on GitHub (Mar 9, 2024):
Mind me asking, what had you set your OLLAMA_BASE_URL to through these attempts?
@welchste commented on GitHub (Mar 12, 2024):
Hey @justinh-rahb, wanted to close this loop. It was user error: I kept adding the port and the /api URI in the UI, while not including the port in the base Ollama URL. Tried again today, put OLLAMA_BASE_URL=IP.V.4.Adress:11434, and that did the trick.
@ilhooq commented on GitHub (May 2, 2024):
I am facing the same problem. By default, Ollama only listens on the local IP address 127.0.0.1, which means requests from inside the Docker container cannot reach the host's port 11434.
To resolve this issue, you can follow these steps:
1. Edit /etc/systemd/system/ollama.service.
2. In the [Service] section, add Environment="OLLAMA_HOST=0.0.0.0".
3. Run sudo systemctl daemon-reload.
4. Run sudo systemctl restart ollama.service.
This configuration change will make Ollama listen on all available network interfaces (0.0.0.0), not just the loopback address.
You can find more detailed information in this issue discussion: https://github.com/ollama/ollama/issues/2603.
@justinh-rahb commented on GitHub (May 2, 2024):
It is NOT recommended to edit .service files directly; you should use a local override by running systemctl edit ollama.service instead.
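A sketch of that recommended flow, mirroring the Ollama FAQ (the unit name ollama.service is assumed from the steps above):

```sh
# Create a drop-in override instead of editing the unit file directly:
sudo systemctl edit ollama.service

# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Then reload and restart:
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
```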
@vanillagreencom commented on GitHub (Jun 5, 2024):
For me and anyone else experiencing the 500 Internal Server Error on POST to Ollama: it was because I had my context window setting too high, since I was using a higher one for OpenAI.
Maybe there should be separate settings like this depending on which model you are using?
@ayang commented on GitHub (Jun 14, 2024):
OLLAMA_HOST=0.0.0.0:11434 ollama serve
this works!
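A note on scope for readers landing here: setting OLLAMA_HOST inline like this only affects that single foreground ollama serve process. Installs managed by systemd need the variable set via the unit override shown earlier, otherwise the setting is lost when the service restarts.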
@milkevolii commented on GitHub (Jul 2, 2024):
Admin Panel -> Connections -> Changing Ollama API from http://localhost:11434 to http://host.docker.internal:11434 worked for me.
@robin536180 commented on GitHub (Jul 5, 2024):
That's the final answer, and it works!
@Bodo-von-Greif commented on GitHub (Aug 30, 2024):
My installation, which had worked without problems, broke when I switched to the systemd installation of Ollama.
This hint saved my configuration, and BTW I also included HTTP_PROXY, HTTPS_PROXY, and NO_PROXY.
Thanks a lot!
@justinh-rahb commented on GitHub (Aug 30, 2024):
@Bodo-von-Greif Setting OLLAMA_HOST for systemd is covered by their FAQ: https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux
@ipipe-hellis commented on GitHub (Oct 26, 2024):
This also worked for me.
@prachetas commented on GitHub (Nov 8, 2024):
If you have installed Ollama on Windows:
Open a command prompt and run ipconfig.
Then take the IPv4 address and add it to Admin Panel -> Connections, e.g. http://192.XXX.XXX.XXX:11434.
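A hedged sketch of that lookup (the address shown is a placeholder):

```bat
:: Windows Command Prompt — filter the adapter list down to IPv4 lines
ipconfig | findstr /i "IPv4"
::    IPv4 Address. . . . . . . . . . . : 192.XXX.XXX.XXX
```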
@ibinasaker commented on GitHub (Nov 24, 2024):
For me the flag "--add-host=host.docker.internal:host-gateway" was the answer. Note that this gives the container a gateway to the host network, which is a bit risky in enterprise/prod environments.
The whole command for me was:
sudo docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
@mjhumphrey4 commented on GitHub (Nov 30, 2024):
I used the following fix:
1. Configure Ollama to listen on all interfaces instead of only 127.0.0.1. This is because host.docker.internal:host-gateway will use the public-facing interface on your host, and allowing Ollama to accept connections from all interfaces is what lets Docker reach Ollama on your host. As a result, if you try curl host.docker.internal:11434/api in the Open WebUI Docker container environment you will not get a result, but after allowing Ollama to listen for connections on all interfaces you will receive "Ollama is running". Below are some test logs.
2. Refresh your Open WebUI interface and verify that you can access Ollama. You can then safely deploy Open WebUI using any of the official installation commands, such as:
sudo docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
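To close the loop on this fix, a hedged verification sketch (container name open-webui assumed, Linux host, OLLAMA_HOST=0.0.0.0 already applied, and curl present in the image):

```sh
# On the host: confirm Ollama is listening on all interfaces, not just loopback
ss -tlnp | grep 11434   # expect 0.0.0.0:11434 or *:11434, not 127.0.0.1:11434

# From inside the Open WebUI container:
docker exec -it open-webui curl -s http://host.docker.internal:11434
# expected response: "Ollama is running"
```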