Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-16 12:02:07 -05:00)
Unable to connect to Ollama #148
Originally created by @netphantom on GitHub (Jan 4, 2024).
Bug Report
Description
Bug Summary:
I am unable to connect the UI to the Ollama server, even though it is running. The UI returns:
Ollama Web UI
Ollama WebUI: Server Connection Error
Ollama Version: Not Detected
Connection Issue or Update Needed
Oops! It seems like your Ollama needs a little attention.
We've detected either a connection hiccup or observed that you're using an older version. Ensure you're on the latest Ollama version
(version 0.1.16 or higher) or check your connection.
Trouble accessing Ollama? Click here for help.
Steps to Reproduce:
Run in a terminal:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main

Then register with the email.
Expected Behavior:
Login
Actual Behavior:
Connection to Ollama error
Environment
Reproduction Details
Confirmation:
Logs and Screenshots
Docker Container Logs:
Screenshots (if applicable):

Installation Method
Docker (image downloaded)
Additional Information
Setting OLLAMA_API_BASE_URL doesn't fix the problem. Using the host network doesn't fix the problem. Ollama is running at http://127.0.0.1:11434/ and responds to curl, as well as to ollama run entered in the console.

Note: If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
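The symptom reported above (curl works on the host, but the webui cannot reach Ollama) is typical of Docker networking: inside a container, 127.0.0.1 refers to the container itself, not the host. A quick way to check, assuming the container name ollama-webui from the run command above and that curl is available inside the image:

```
# On the host: should print Ollama's version if it is listening
curl http://127.0.0.1:11434/api/version

# Inside the container, 127.0.0.1 is the container itself, so this fails:
docker exec ollama-webui curl -s http://127.0.0.1:11434/api/version

# With --add-host=host.docker.internal:host-gateway, the host is reachable as:
docker exec ollama-webui curl -s http://host.docker.internal:11434/api/version
```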
@tjbck commented on GitHub (Jan 4, 2024):
Hi! Just to confirm, the command didn't work for you after removing the old webui docker instance (docker rm -f ollama-webui)?

@netphantom commented on GitHub (Jan 4, 2024):
Hi @tjbck , I just tried the command and the answer is no. I deleted volumes, image, container and everything. After re-signing up, the error persists.
Please find attached the docker logs
@tjbck commented on GitHub (Jan 4, 2024):
Hmm, could you also share the browser console logs with us? Thanks!
@tjbck commented on GitHub (Jan 4, 2024):
It should've been in the backend logs, so please make sure that the Ollama API URL is set to /ollama/api in the webui settings as well!

@netphantom commented on GitHub (Jan 4, 2024):
I thought for a moment that it worked, but I am getting this error on the backend.
In the Firefox log I see these:
The Ollama server is running; as a matter of fact, if I run
@tjbck commented on GitHub (Jan 4, 2024):
Seems like you might've set your OLLAMA_API_BASE_URL env var to http://127.0.0.1:11434/ollama/api; could you verify that it's been set to http://127.0.0.1:11434/api?

@tjbck commented on GitHub (Jan 5, 2024):
If you update to the latest release, the settings should look something like this:
@netphantom commented on GitHub (Jan 5, 2024):
I think I got it working.
I purged the docker images and ran the new one with:

docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main

The new screen actually shows the correct base url: http://127.0.0.1:11434/api
Thanks for the support! :)
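The /api vs. /ollama/api confusion above comes down to how the configured base URL gets joined with endpoint paths. A minimal sketch of that joining (join_api is a hypothetical helper for illustration, not Open WebUI's actual code):

```python
def join_api(base: str, endpoint: str) -> str:
    """Join a base URL and an endpoint path, normalizing the slashes."""
    return base.rstrip("/") + "/" + endpoint.lstrip("/")

# With the correct base URL, requests hit real Ollama endpoints:
print(join_api("http://127.0.0.1:11434/api", "tags"))
# With the misconfigured base URL, every request carries an extra
# /ollama prefix that the Ollama server does not serve:
print(join_api("http://127.0.0.1:11434/ollama/api", "tags"))
```

This is why the webui reported a connection error even though the server itself was reachable: the requests were simply going to paths that don't exist.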
@tjbck commented on GitHub (Jan 5, 2024):
Glad it's working for you now!
@maurimv commented on GitHub (May 1, 2024):
Thanks, works for me!
@justinh-rahb commented on GitHub (May 1, 2024):
That's old advice. The Ollama URL variable values should no longer have /api on the end; this may cause problems.

@hemangjoshi37a commented on GitHub (May 6, 2024):

Hi, it is working in ollama-webui but not working in open-webui:
http://127.0.0.1:11434/api

@ahmetcanisik commented on GitHub (Jun 5, 2024):
When I tried http://localhost:8080, webui finally saw my ollama models.
@ccarrez commented on GitHub (Aug 14, 2024):
netphantom found a solution that shows the problem is related to the Docker network:
it works when using the host network.
However, in that case you cannot use port redirection (-p 3000:8080).
Open WebUI is still not working as intended, cf. the readme installation guide.
@laurentperez commented on GitHub (Aug 30, 2024):
Sorry to bump on this closed one, but @ccarrez is right: running the container on the host network will prevent port forwarding.
When you already run local containers on port 8080, this is not possible.
EDIT: you need to run open-webui directly without port redirecting:

docker run --network=host -e WEBUI_AUTH=false -e PORT=3000 -e OLLAMA_BASE_URL=http://xxxx

@hemangjoshi37a commented on GitHub (Aug 31, 2024):
Now it is not even starting, and it gives this error in the docker container logs:
@ccarrez commented on GitHub (Sep 1, 2024):
This is working fine with docker compose.
See the following docker-compose.yml example:
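The compose file itself did not survive in this thread. A minimal sketch of the kind of setup described, assuming the standard ollama/ollama and ghcr.io/open-webui/open-webui:main images (volume names are illustrative); note the base URL points at the ollama service over the compose network, with no /api suffix:

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
    restart: always

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # The service name "ollama" resolves inside the compose network,
      # so no host networking or host-gateway alias is needed.
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
    restart: always

volumes:
  ollama:
  open-webui:
```

Because both containers share a compose-managed network, this sidesteps the 127.0.0.1 confusion discussed earlier while keeping the -p 3000:8080 port mapping available.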
@hemangjoshi37a commented on GitHub (Sep 2, 2024):
OK, this is good, but why hasn't this been fixed natively in the source code after such a long time? So many people have had this problem for so long.
@atljoseph commented on GitHub (Oct 2, 2024):
I installed this via docker last month. Now, it won’t connect to Ollama. Ollama install hasn’t changed at all. Why is this still a problem? Gonna go spend my time on something else, now. A different repo.