mirror of
https://github.com/open-webui/open-webui.git
synced 2026-03-10 15:54:15 -05:00
issue: OLLAMA_HEADERS environment variable is ignored for Ollama connections through cloudflare trusted proxy #6636
Originally created by @LavaTiger99 on GitHub (Oct 9, 2025).
Check Existing Issues
Installation Method
Docker
Open WebUI Version
v0.6.33
Ollama Version (if applicable)
0.11.4
Operating System
Ubuntu 22.04
Browser (if applicable)
Chrome
Confirmation
Expected Behavior
When Open WebUI is configured to connect to a remote Ollama instance protected by an external authentication proxy (like Cloudflare Access) that requires custom headers, the application should connect when the headers are provided. Open WebUI should read the OLLAMA_HEADERS environment variable, parse the JSON, and attach the specified headers to all outgoing API requests to the OLLAMA_BASE_URLS. The connection should succeed, and the models should be listed in the UI.
Actual Behavior
When Open WebUI is configured to connect to a remote Ollama instance protected by an external authentication proxy (like Cloudflare Access) that requires custom headers, the application fails to connect. The OLLAMA_HEADERS environment variable, which should be used to inject these custom headers, appears to be ignored by the application. This results in a 403 Forbidden error from the proxy, as the authentication headers are never sent.
We have confirmed through curl and a custom Python script executed inside the Open WebUI container that the networking, DNS, and authentication credentials are all correct and functional. The failure is isolated to the Open WebUI application's inability to attach the specified headers to its outgoing requests.
Steps to Reproduce
Set up a remote Ollama instance.
Protect the instance with a reverse proxy that requires custom headers for authentication. In our case, this was Cloudflare Tunnel with a Cloudflare Access policy requiring a CF-Access-Client-Id and CF-Access-Client-Secret.
Deploy Open WebUI using the following docker-compose.yml configuration:
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    environment:
      OLLAMA_BASE_URLS: 'https://your-ollama-url.com'
      OLLAMA_HEADERS: '{"CF-Access-Client-Id": "YOUR_ID.access", "CF-Access-Client-Secret": "YOUR_SECRET"}'
      OLLAMA_TLS_VERIFY: 'false'
      # ... other configs
Start the container and open the Web UI.
Observe that no models are loaded from the remote Ollama instance.
Check the container logs (docker logs open-webui).
Logs & Screenshots
Open WebUI Log Output (Failure)
The application logs show a 403 Forbidden error, indicating the proxy denied the request. The error message Attempt to decode JSON with unexpected mimetype: text/html confirms the application received an HTML error page from the proxy instead of a JSON API response.
ERROR | open_webui.routers.ollama:send_get_request:106 - Connection error: 403, message='Attempt to decode JSON with unexpected mimetype: text/html', url='https://your-ollama-url.com/api/tags'
ERROR | open_webui.routers.ollama:send_get_request:106 - Connection error: 403, message='Attempt to decode JSON with unexpected mimetype: text/html', url='https://your-ollama-url.com/api/version'
curl Test Inside the Container (Success)
Running curl with the correct headers from inside the container works perfectly and returns the list of models. This proves the container has correct network access and the credentials are valid.
docker exec open-webui curl \
  -H "CF-Access-Client-Id: YOUR_ID.access" \
  -H "CF-Access-Client-Secret: YOUR_SECRET" \
  https://your-ollama-url.com/api/tags
Output:
A Python script using the requests library also works perfectly from inside the container, proving the environment is capable of making a successful connection.
docker exec open-webui python3 /app/test_connection.py
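The test script itself wasn't attached to the report. As a minimal sketch only of what such a check might look like (the URL, credential values, and function names here are assumptions, not the reporter's actual script):

```python
# Hypothetical reconstruction of a connectivity test like /app/test_connection.py.
# The base URL and CF Access credentials below are placeholders, not real values.
import json


def parse_header_json(raw: str) -> dict:
    """Parse a JSON object of header-name -> value pairs, the format the
    reporter expected OLLAMA_HEADERS to use. Returns {} for empty input."""
    return json.loads(raw) if raw else {}


def fetch_tags(base_url: str, headers: dict) -> int:
    """GET /api/tags with the given headers; returns the HTTP status code."""
    import requests  # third-party, but available inside the Open WebUI image
    resp = requests.get(f"{base_url}/api/tags", headers=headers, timeout=10)
    return resp.status_code


if __name__ == "__main__":
    raw = '{"CF-Access-Client-Id": "YOUR_ID.access", "CF-Access-Client-Secret": "YOUR_SECRET"}'
    headers = parse_header_json(raw)
    # With valid credentials this should return 200 and a JSON model list;
    # without the headers, Cloudflare Access answers 403 with an HTML page.
    # print(fetch_tags("https://your-ollama-url.com", headers))
```

When the headers are present the proxy passes the request through to Ollama; when they are missing, the 403 HTML page is what triggers the "unexpected mimetype: text/html" error seen in the logs above.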
Additional Information
Troubleshooting Steps Attempted
Switched between OLLAMA_BASE_URL and OLLAMA_BASE_URLS.
Switched between open-webui:main-slim and open-webui:main images.
Completely removed and recreated the persistent data volume (./open-webui) to ensure a fresh configuration.
Confirmed that adding the connection via the UI does not work, as the UI does not support multiple custom headers (only a single Bearer token).
@tjbck commented on GitHub (Oct 9, 2025):
We don't have an OLLAMA_HEADERS env var.
@sclass commented on GitHub (Oct 21, 2025):
Hello everyone! I’ll be testing Cloudflare Dashboard → Rules → Transform Rules → HTTP Request Header Modification.
No comment from me means everything worked perfectly. ✅
@sclass commented on GitHub (Oct 21, 2025):
❌ FAILED: Adding headers with the “CF-” prefix is not allowed at Cloudflare.
I tried adding a header starting with “CF-” in Transform Rules → HTTP Request Header Modification,
but Cloudflare said: “Nope, reserved prefix!”
So, lesson learned: CF- is blocked by Cloudflare.
We need to check this part of the code to resolve the issue.
@LavaTiger99 commented on GitHub (Oct 21, 2025):
I ended up using an nginx reverse proxy to inject the headers before passing to Cloudflare. It was the only way I could get it to work.
@LavaTiger99 commented on GitHub (Oct 21, 2025):
Here are the files if anyone else is interested
Dockerfile:
default.conf.template (change the port and URL, inject the CF tokens via docker env file):
Then just point Open WebUI to the Docker container (either using host.docker.internal or other networking) on port 11434.
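The attached Dockerfile and default.conf.template weren't captured in this mirror. As a rough sketch only (the upstream hostname, env var names, and templating are assumptions): an nginx server block that listens where Open WebUI expects Ollama (port 11434) and injects the CF Access headers before forwarding. The `${...}` variables assume the official nginx Docker image's envsubst templating of `*.template` files, with the secrets supplied via the docker env file.

```nginx
# Hypothetical sketch, not the poster's actual default.conf.template.
# CF_ACCESS_CLIENT_ID / CF_ACCESS_CLIENT_SECRET are assumed env var names.
server {
    listen 11434;

    location / {
        proxy_pass https://your-ollama-url.com;
        proxy_set_header Host your-ollama-url.com;
        # Inject the Cloudflare Access service-token headers on every request
        proxy_set_header CF-Access-Client-Id ${CF_ACCESS_CLIENT_ID};
        proxy_set_header CF-Access-Client-Secret ${CF_ACCESS_CLIENT_SECRET};
        # Send SNI matching the upstream hostname for the TLS handshake
        proxy_ssl_server_name on;
    }
}
```

This works because the header injection happens outside Cloudflare's edge, so the reserved "CF-" prefix restriction on Transform Rules does not apply.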