Mirror of https://github.com/open-webui/open-webui.git, synced 2026-03-25 04:24:30 -05:00
docker compose webui connection issue #15
Originally created by @chymian on GitHub (Oct 30, 2023).
Originally assigned to: @tjbck on GitHub.
Describe the bug
The WebUI has connection problems and shows no models when the Ollama server runs in Docker.
This is true whether the WebUI runs in Docker or is started from the CLI.
To Reproduce
Steps to reproduce the behavior:
docker-compose.yaml
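The reporter's compose file is not reproduced here; a minimal sketch of the kind of setup being described (service names, image tags, and whether OLLAMA_API_BASE_URL is a build arg or an environment variable are illustrative assumptions, not the reporter's actual file) might look like:

```yaml
# Hypothetical sketch of an Ollama + WebUI compose setup, not the
# reporter's actual file. Service names, images, and ports are assumptions.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
  webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # The WebUI reaches Ollama via the compose service name, not localhost.
      - OLLAMA_API_BASE_URL=http://ollama:11434/api
    depends_on:
      - ollama
volumes:
  ollama:
```

The key point of such a setup is that the two containers talk over the compose network using service names, which is the "internal docker routing" the expected behavior above refers to.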
Expected behavior
webui connects to ollama-api via internal docker routing.
Server:
Additional context
Via setup & build args I have tried many potential URL permutations.
All of them (localhost, 0.0.0.0, VPN IP) fail the connection test, except the LAN IP,
but even the LAN IP still shows no models.
`ollama list` works normally, and curl from another host via the VPN also works.
in the browser console log:
IP: 10.11.1.x is VPN
IP: 192.168.178.x is LAN
@tjbck commented on GitHub (Oct 30, 2023):
Hi, it looks like you're facing a CORS error. Could you `docker exec` into the Ollama container and make sure your environment variable has been registered? Additionally, if both Ollama and Ollama WebUI are running on the same machine, you don't have to add the OLLAMA_API_BASE_URL build argument, and there is no need to provide extra_hosts either. Keep us updated. Thanks.
@chymian commented on GitHub (Oct 30, 2023):
You mean whether they are in the environment within the container?
Yes, they are.
I changed the compose file back to no build args & extra_hosts, with no change.
If I understand it right, if it's Ollama's CORS blocking, then I should not be able to curl from the same/another host either?
And then I don't understand why it's not throwing the CORS error when using the LAN IP.
@tjbck commented on GitHub (Oct 30, 2023):
A CORS error only occurs in a browser environment, so it would not affect API calls made with curl.
Also, the second log you provided doesn't seem to include '/api' at the end (it should be 'http://192.168.178.17:11434/api/tags'), which would cause a connection problem, not a CORS error.
If the environment variables were in fact correctly registered with the Ollama container, there shouldn't be an issue. Could you try running the Ollama and Ollama WebUI containers separately and see if that fixes your issue?
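The missing '/api' suffix mentioned above is the kind of thing a client can reintroduce defensively. A small sketch of such a normalization helper (hypothetical, for illustration only; not part of the WebUI codebase):

```python
def normalize_ollama_base(url: str) -> str:
    """Ensure a configured Ollama base URL ends with '/api'.
    Hypothetical helper for illustration, not actual WebUI code."""
    url = url.rstrip("/")
    if not url.endswith("/api"):
        url += "/api"
    return url

# A bare host:port gets '/api' appended; an already-correct URL is untouched.
print(normalize_ollama_base("http://192.168.178.17:11434"))       # http://192.168.178.17:11434/api
print(normalize_ollama_base("http://192.168.178.17:11434/api/"))  # http://192.168.178.17:11434/api
```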
@chymian commented on GitHub (Oct 30, 2023):
I ran them separately again, with no change.
I've played around with CORS a bit:
Server (gulag):
Laptop/browser
only via VPN 10.11.1.11
when I add:
OLLAMA_ORIGINS="http://gulag:*,http://10.11.1.17:*,http://10.11.1.11:*"
and empty/reset the field in the settings to http://gulag:11434/api,
I get this in the Docker logs:
ollama-api | [GIN] 2023/10/30 - 18:05:24 | 403 | 7.953µs | 10.11.1.11 | OPTIONS "/api/tags"
and this in the browser console:
Access to fetch at 'http://gulag:11434/api/tags' from origin 'http://gulag:3000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled
So in contrast to the LAN IP, which has no connection (not routed from/to the laptop), using the VPN address at least reaches Ollama.
But neither explicit nor implicit CORS allowance works.
Looking up the error message, it seems that Ollama does not respond with the right header?
Ollama does answer if I directly browse 'http://gulag:11434/api/tags' from my laptop.
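For context on what the 403 on OPTIONS means: the browser sends a preflight OPTIONS request with an Origin header, and the server must answer with a matching Access-Control-Allow-Origin header or the browser blocks the real request. A minimal sketch of that server-side decision, using fnmatch-style wildcards in the spirit of the OLLAMA_ORIGINS patterns above (an illustration only, not Ollama's actual implementation):

```python
from fnmatch import fnmatch

def allow_origin(origin, allowed_patterns):
    """Return the Access-Control-Allow-Origin value to send back,
    or None if the preflight should be rejected (e.g. HTTP 403)."""
    for pattern in allowed_patterns:
        if fnmatch(origin, pattern):
            return origin  # echo the origin back in the response header
    return None

# Patterns in the style of the OLLAMA_ORIGINS value quoted above.
patterns = ["http://gulag:*", "http://10.11.1.17:*", "http://10.11.1.11:*"]

print(allow_origin("http://gulag:3000", patterns))         # matches http://gulag:*
print(allow_origin("http://evil.example:3000", patterns))  # no match -> None
```

If the environment variable never reaches the server process, the allow-list is effectively empty and every preflight fails, which matches the symptom in the log above.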
@chymian commented on GitHub (Oct 30, 2023):
In addition, there is a difference in the logged requests:
via the browser, a GET is requested and answered
via the WebUI, an OPTIONS is requested and blocked by CORS
Don't know whether that helps.
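The GET/OPTIONS difference is expected browser behavior: a cross-origin fetch that is not a "simple request" (for example, one carrying a Content-Type like application/json) is preceded by an OPTIONS preflight, while typing the URL into the address bar is a plain navigation GET with no preflight at all. A rough, simplified sketch of the rule (the full definition lives in the Fetch standard):

```python
SIMPLE_METHODS = {"GET", "HEAD", "POST"}
SIMPLE_CONTENT_TYPES = {
    "application/x-www-form-urlencoded",
    "multipart/form-data",
    "text/plain",
}

def needs_preflight(method, headers):
    """Simplified check for whether a cross-origin request triggers an
    OPTIONS preflight. Illustrative only; see the Fetch spec for the
    complete rules (e.g. header value restrictions)."""
    if method.upper() not in SIMPLE_METHODS:
        return True
    for name, value in headers.items():
        lname = name.lower()
        if lname == "content-type":
            if value.split(";")[0].strip().lower() not in SIMPLE_CONTENT_TYPES:
                return True
        elif lname not in {"accept", "accept-language", "content-language"}:
            return True
    return False

# A fetch() with a JSON body or custom header is preflighted;
# a bare GET (like address-bar navigation) is not.
print(needs_preflight("POST", {"Content-Type": "application/json"}))  # True
print(needs_preflight("GET", {}))                                     # False
```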
@tjbck commented on GitHub (Oct 30, 2023):
If you're directly browsing to the Ollama server, it will not cause a CORS error because it would be from the same origin.
More info on CORS error here
The problem is likely caused by your VPN setup, as most people running the WebUI don't seem to be experiencing your issue.
Please try running Ollama as instructed here from your localhost:
If you're running both Ollama and Ollama WebUI from your localhost, it's guaranteed to work.
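The same-origin point can be made concrete: two URLs share an origin only when scheme, host, and port all match, so the WebUI at http://gulag:3000 and Ollama at http://gulag:11434 are different origins even on the same machine, and the WebUI's fetches are therefore cross-origin. A small sketch using only the standard library:

```python
from urllib.parse import urlsplit

def origin(url):
    """Return the (scheme, host, port) triple that defines a web origin."""
    parts = urlsplit(url)
    return (parts.scheme, parts.hostname, parts.port)

webui = origin("http://gulag:3000")
ollama = origin("http://gulag:11434")

print(webui)            # ('http', 'gulag', 3000)
print(webui == ollama)  # False: same scheme and host, different port
```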
@chymian commented on GitHub (Oct 30, 2023):
from the link you provided
I think there is a misunderstanding:
so why would my laptop be same-origin with the server, when they only have the protocol in common?
And then why the CORS error only via the WebUI, and not when browsing the ollama-api directly?
That was where I started, with no success, so I wrote the compose file. …
It's late in the night here, I will tackle that again tomorrow.
thank you very much for your help so far, @tjbck
@tjbck commented on GitHub (Oct 30, 2023):
Good news! Just added compose.yaml file to the repo and it seems to work!
Just replaced
to this:
@chymian commented on GitHub (Oct 31, 2023):
I can confirm that it's working.
thank you very much.
I just loaded codebooga and it was producing endless garbage. Is there a way to stop it without restarting the container?
@tjbck commented on GitHub (Nov 2, 2023):
The stop-generation button has been implemented with #48. Thanks.