Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 10:58:17 -05:00)
[GH-ISSUE #87] Doesn't work behind cloudflare + reverse proxy
Originally created by @semioniy on GitHub (Nov 11, 2023).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/87
Describe the bug
When accessed through LAN, webui works properly. When accessing it through cloudflare + reverse proxy, like I access all the applications running on my virtualization NAS, webui seems to be unable to find ollama by its address. (If I had to guess, something in the webui doesn't use the Ollama Server URL variable and instead tries to resolve ollama's address by itself, which fails because it gets my domain instead of localhost.)
To Reproduce
Expected behavior
Both instances of webui work the same
Actual behavior
webui over LAN can find ollama's endpoint, and you can download models / make prompts
webui over domain name can't find ollama's endpoint; the default value of Ollama Server URL is http://:11434/api, which cannot be reached, values using localhost or 127.0.0.1 don't work either, and you can't download models / select downloaded ones
Screenshots
@tjbck commented on GitHub (Nov 11, 2023):
Hi, it seems like you're experiencing an issue because you're trying to access an HTTP resource from an HTTPS page. Make sure to also put Ollama behind the reverse proxy and it should solve your issue. Keep us updated. Thanks!
@semioniy commented on GitHub (Nov 11, 2023):
Hi, @tjbck. You mean so that webui and ollama would communicate over the internet rather than over LAN? If so, then that's... a very roundabout solution. And one that AFAIK won't work, because cloudflare returns an HTML error page if your server doesn't respond to GET requests on port 80 / 443, and ollama's endpoint, even if put behind a reverse proxy with mapping 11434:80, won't respond to GET requests.

@tjbck commented on GitHub (Nov 11, 2023):
Hi, the issue is occurring because you're accessing the web ui over https while the ollama server is served over http, and most browsers refuse to make insecure (http) requests from a page served over https. What I would suggest in your situation is to use reverse proxy software like nginx and expose port 11434 behind the proxy. Alternatively, try accessing the webui over plain http and everything should work fine in your case. Keep us updated. Thanks.
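The thread doesn't include the nginx configuration tjbck had in mind; a minimal sketch of the idea, assuming nginx terminates TLS for a placeholder hostname (ollama.example.com) and forwards to Ollama on localhost:11434, might look like:

```nginx
# Sketch only: ollama.example.com and the certificate paths are placeholders.
server {
    listen 443 ssl;
    server_name ollama.example.com;

    ssl_certificate     /etc/ssl/certs/ollama.example.com.pem;
    ssl_certificate_key /etc/ssl/private/ollama.example.com.key;

    location / {
        # Forward all requests to the local Ollama server.
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;
        # Ollama streams responses; disable buffering so tokens arrive promptly.
        proxy_buffering off;
    }
}
```

With something like this in place, the webui's Ollama Server URL would point at https://ollama.example.com/api instead of an http:// address, avoiding the mixed-content block.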
@oliverbob commented on GitHub (Nov 17, 2023):
sudo a2enmod ssl
sudo a2enmod proxy
sudo a2enmod proxy_http
sudo a2enmod proxy_wstunnel # Enable if you need WebSocket support
sudo systemctl restart apache2 # Restart Apache to apply changes
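The VirtualHost that goes with those modules isn't preserved in the thread; a minimal sketch, assuming Apache terminates TLS and proxies to Ollama on localhost:11434 (ollama.example.com and the certificate paths are placeholders), could be:

```apache
<VirtualHost *:443>
    ServerName ollama.example.com

    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/ollama.example.com.pem
    SSLCertificateKeyFile /etc/ssl/private/ollama.example.com.key

    # Forward everything to the local Ollama server
    # (requires mod_proxy and mod_proxy_http, enabled above).
    ProxyPreserveHost On
    ProxyPass        / http://127.0.0.1:11434/
    ProxyPassReverse / http://127.0.0.1:11434/
</VirtualHost>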
Then in your local terminal try it with:
You should now be able to connect with your API.
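The exact command oliverbob suggested isn't preserved in this mirror. A typical connectivity check, assuming Ollama's HTTP API is exposed at your proxied hostname (ollama.example.com is a placeholder), might be:

```shell
# Hypothetical check; replace ollama.example.com with the hostname
# your reverse proxy maps to Ollama.
OLLAMA_URL="${OLLAMA_URL:-https://ollama.example.com}"
# /api/tags is Ollama's model-listing endpoint; -f makes curl fail on HTTP errors.
curl -fsS "$OLLAMA_URL/api/tags" || echo "Ollama not reachable at $OLLAMA_URL"
```

If the proxy is wired up correctly, this returns a JSON list of the models available locally; otherwise it prints the fallback message.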
If you have some more questions, you can ask your model.
@tjbck commented on GitHub (Nov 17, 2023):
This issue should've been solved with #111. Keep us updated. Thanks!