Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 19:38:46 -05:00)
[GH-ISSUE #802] open-webui doesn't detect ollama #12222
Originally created by @mira-roza on GitHub (Feb 19, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/802
Bug Report
Description
Bug Summary:
open-webui doesn't detect ollama
Steps to Reproduce:
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

Expected Behavior:
to be able to use it
Actual Behavior:
On the webpage I have:
Connection Issue or Update Needed
Oops! It seems like your Ollama needs a little attention.
We've detected either a connection hiccup or observed that you're using an older version. Ensure you're on the latest Ollama version
(version 0.1.16 or higher) or check your connection.
Trouble accessing Ollama? Click here for help.
Environment
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
Docker Container Logs:
Screenshots (if applicable):
Installation Method
With Docker, I tried:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

and

docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

(same result), and with

docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main

I have on the webpage http://localhost:3000: Open WebUI Backend Required
Oops! You're using an unsupported method (frontend only). Please serve the WebUI from the backend.
Additional Information
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
@justinh-rahb commented on GitHub (Feb 19, 2024):
Read the Ollama instructions for setting environment variables on Linux, and then change your `OLLAMA_API_BASE_URL` in the `docker run` command to `host.docker.internal`.

@mira-roza commented on GitHub (Feb 19, 2024):
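Applied to the command from the bug report, the suggested change might look like this (a sketch of the advice above, not the exact command from the thread):

```shell
# Point the WebUI container at Ollama running on the host, via the
# host.docker.internal alias (assumes Ollama listens on its default port 11434).
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_API_BASE_URL=http://host.docker.internal:11434/api \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```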
I still have the problem.
I modified to:
I did:
After that:
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

and I open http://127.0.0.1:3000/ and I still have the error: Connection Issue or Update Needed
@justinh-rahb commented on GitHub (Feb 19, 2024):
Set the environment variable for the Ollama host in your `docker run` command. You may also need `Environment="OLLAMA_ORIGINS=*"` in your systemd override.

@mira-roza commented on GitHub (Feb 19, 2024):
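The systemd override being discussed might look like this (a sketch; it assumes the service is named `ollama`, as with the official Linux install, and uses the same values mentioned later in this thread):

```shell
# Create a systemd drop-in so Ollama listens on all interfaces and
# accepts cross-origin requests; then reload and restart the service.
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf >/dev/null <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
EOF
sudo systemctl daemon-reload
sudo systemctl restart ollama
```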
I have the same result. What information would help you figure out where the problem is?
@justinh-rahb commented on GitHub (Feb 19, 2024):
Try adding this to the `docker run` command:

--add-host=host.docker.internal:host-gateway

And be sure you're removing the previous containers you've launched.
If it's still not working, we should probably be certain that your Ollama is reachable:
The result should say "OK". If this can't be done then Ollama is not properly using the environment variable you set.
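The original check was elided from this thread; a hedged sketch of how one might verify reachability (these exact commands are an assumption, not the command the commenter posted):

```shell
# From the host: confirm Ollama answers on its default port.
curl http://localhost:11434/api/version

# From inside the WebUI container: confirm host.docker.internal resolves and
# Ollama is reachable through it (assumes the container is named "open-webui"
# and has curl available).
docker exec open-webui curl -s http://host.docker.internal:11434/api/version
```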
@mira-roza commented on GitHub (Feb 19, 2024):
I used the `docker run` command you provided, and I set the env vars in /etc/systemd/system/ollama.service.d/override.conf. The env vars I set are OLLAMA_HOST=0.0.0.0 and OLLAMA_ORIGINS=*. My Ollama runs on the same device as open-webui, and it works: for example, if I do `curl http://localhost:11434/api/version` I get {"version":"0.1.25"}. I always remove the previous container, as otherwise I can't create a new one. I used:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e OLLAMA_API_BASE_URL=http://host.docker.internal:11434/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

and it still doesn't work.
@justinh-rahb commented on GitHub (Feb 19, 2024):
Let's try an alternative approach from here:
You'll access the WebUI from http://localhost:8080 instead now.
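The alternative command was elided here; based on the port change to 8080 and the later discussion of `host` networking (and the host-network variant already quoted in the bug report above), it was presumably along these lines:

```shell
# Host networking: the container shares the host's network stack, so the
# backend's port 8080 is exposed directly (no -p mapping) and Ollama is
# reachable on 127.0.0.1.
docker run -d --network=host \
  -v open-webui:/app/backend/data \
  -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```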
@mira-roza commented on GitHub (Feb 19, 2024):
When I do that I get "Unable to connect". I think it's because:
@justinh-rahb commented on GitHub (Feb 19, 2024):
When we use `host` networking we don't need to open the ports. What about http://127.0.0.1:8080?

Is it possible you have something else that already claimed port 8080?
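One way to check for a port conflict like this (a sketch; assumes a Linux host with iproute2's `ss` installed — `lsof -i :8080` is an alternative):

```shell
# List listening TCP sockets and filter for port 8080; if nothing matches,
# the port is free and the WebUI backend is simply not listening.
ss -tlnp | grep ':8080' || echo "nothing listening on 8080"
```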
@mira-roza commented on GitHub (Feb 19, 2024):
I don't think so, and I don't find 8080 when I run `ss`. I tried http://127.0.0.1:8080/ and I get "Unable to connect".

@justinh-rahb commented on GitHub (Feb 19, 2024):
Well, now I'm unsure. Maybe this will need fresh eyes to take a look.