issue: OLLAMA_HEADERS environment variable is ignored for Ollama connections through cloudflare trusted proxy #6636

Closed
opened 2025-11-11 17:01:55 -06:00 by GiteaMirror · 5 comments
Owner

Originally created by @LavaTiger99 on GitHub (Oct 9, 2025).

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

v0.6.33

Ollama Version (if applicable)

0.11.4

Operating System

Ubuntu 22.04

Browser (if applicable)

Chrome

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
      • Start with the initial platform/version/OS and dependencies used,
      • Specify exact install/launch/configure commands,
      • List URLs visited, user input (incl. example values/emails/passwords if needed),
      • Describe all options and toggles enabled or changed,
      • Include any files or environmental changes,
      • Identify the expected and actual result at each stage,
      • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

When Open WebUI is configured to connect to a remote Ollama instance protected by an external authentication proxy (like Cloudflare Access) that requires custom headers, the application should connect when the headers are provided. Open WebUI should read the OLLAMA_HEADERS environment variable, parse the JSON, and attach the specified headers to all outgoing API requests to the OLLAMA_BASE_URLS. The connection should succeed, and the models should be listed in the UI.
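The expected flow described above could be sketched like this (a hypothetical sketch only — the `OLLAMA_HEADERS` variable and the parsing shown are the reporter's expectation, not a confirmed Open WebUI feature):

```python
import json
import os

# Hypothetical: read OLLAMA_HEADERS and parse its JSON value into a dict
# of header names/values. An absent variable yields no extra headers.
raw = os.environ.get("OLLAMA_HEADERS", "{}")
extra_headers = json.loads(raw)

# Every outgoing request to OLLAMA_BASE_URLS would then carry these headers,
# e.g. with the requests library:
#   requests.get(f"{base_url}/api/tags", headers=extra_headers)
print(extra_headers)
```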

Actual Behavior

When Open WebUI is configured to connect to a remote Ollama instance protected by an external authentication proxy (like Cloudflare Access) that requires custom headers, the application fails to connect. The OLLAMA_HEADERS environment variable, which should be used to inject these custom headers, appears to be ignored by the application. This results in a 403 Forbidden error from the proxy, as the authentication headers are never sent.
We have confirmed through curl and a custom Python script executed inside the Open WebUI container that the networking, DNS, and authentication credentials are all correct and functional. The failure is isolated to the Open WebUI application's inability to attach the specified headers to its outgoing requests.

Steps to Reproduce

  1. Set up a remote Ollama instance.
  2. Protect the instance with a reverse proxy that requires custom headers for authentication. In our case, this was Cloudflare Tunnel with a Cloudflare Access policy requiring a CF-Access-Client-Id and CF-Access-Client-Secret.
  3. Deploy Open WebUI using the following docker-compose.yml configuration:

     services:
       open-webui:
         image: ghcr.io/open-webui/open-webui:main
         container_name: open-webui
         environment:
           OLLAMA_BASE_URLS: 'https://your-ollama-url.com'
           OLLAMA_HEADERS: '{"CF-Access-Client-Id": "YOUR_ID.access", "CF-Access-Client-Secret": "YOUR_SECRET"}'
           OLLAMA_TLS_VERIFY: 'false'
           # ... other configs

  4. Start the container and open the Web UI.
  5. Observe that no models are loaded from the remote Ollama instance.
  6. Check the container logs (docker logs open-webui).

Logs & Screenshots

  1. Open WebUI Log Output (Failure)
    The application logs show a 403 Forbidden error, indicating the proxy denied the request. The error message Attempt to decode JSON with unexpected mimetype: text/html confirms the application received an HTML error page from the proxy instead of a JSON API response.
    ERROR | open_webui.routers.ollama:send_get_request:106 - Connection error: 403, message='Attempt to decode JSON with unexpected mimetype: text/html', url='https://your-ollama-url.com/api/tags'
    ERROR | open_webui.routers.ollama:send_get_request:106 - Connection error: 403, message='Attempt to decode JSON with unexpected mimetype: text/html', url='https://your-ollama-url.com/api/version'

  2. curl Test Inside the Container (Success)
    Running curl with the correct headers from inside the container works perfectly and returns the list of models. This proves the container has correct network access and the credentials are valid.
    docker exec open-webui curl \
      -H "CF-Access-Client-Id: YOUR_ID.access" \
      -H "CF-Access-Client-Secret: YOUR_SECRET" \
      https://your-ollama-url.com/api/tags

Output:

{"models":[{"name":"gemma3:27b", ...}]}

  3. Python Script Test Inside the Container (Success)
    A Python script using the requests library also works perfectly from inside the container, proving the environment is capable of making a successful connection.
test_connection.py
import requests

headers = {
    "CF-Access-Client-Id": "YOUR_ID.access",
    "CF-Access-Client-Secret": "YOUR_SECRET",
}
response = requests.get("https://your-ollama-url.com/api/tags", headers=headers, verify=False)
print(response.status_code)
print(response.json())

docker exec open-webui python3 /app/test_connection.py

Output:
200
{'models': [{'name': 'gemma3:27b', ...}]}

Additional Information

Troubleshooting Steps Attempted
  • Switched between OLLAMA_BASE_URL and OLLAMA_BASE_URLS.
  • Switched between open-webui:main-slim and open-webui:main images.
  • Completely removed and recreated the persistent data volume (./open-webui) to ensure a fresh configuration.
  • Confirmed that adding the connection via the UI does not work, as the UI does not support multiple custom headers (only a single Bearer token).

GiteaMirror added the bug label 2025-11-11 17:01:55 -06:00
Author
Owner

@tjbck commented on GitHub (Oct 9, 2025):

We don't have OLLAMA_HEADERS env var.


@sclass commented on GitHub (Oct 21, 2025):

Hello everyone! I’ll be testing Cloudflare Dashboard → Rules → Transform Rules → HTTP Request Header Modification.
No comment from me means everything worked perfectly.


@sclass commented on GitHub (Oct 21, 2025):

FAILED: adding “CF-” headers is not allowed at Cloudflare.
I tried adding a header starting with “CF-” in Transform Rules → HTTP Request Header Modification,
but Cloudflare said: “Nope, reserved prefix!”

So, lesson learned: CF- is blocked by Cloudflare.
We need to check this part of the code to resolve the issue.


@LavaTiger99 commented on GitHub (Oct 21, 2025):

I ended up using an nginx reverse proxy to inject the headers before passing to Cloudflare. Only way I could get it to work.


@LavaTiger99 commented on GitHub (Oct 21, 2025):

Here are the files if anyone else is interested

Dockerfile:

FROM nginx:1.25-alpine
# Nginx official image will envsubst any templates in /etc/nginx/templates at startup
COPY default.conf.template /etc/nginx/templates/default.conf.template

default.conf.template (change the port and URL, inject the CF tokens via docker env file):

server {
  listen 11434;

  # DNS for upstream resolution (optional but robust)
  resolver 1.1.1.1 1.0.0.1 valid=300s;
  resolver_timeout 5s;

  location / {
    proxy_pass ${UPSTREAM};
    proxy_http_version 1.1;

    # Ensure TLS SNI and Host header match the upstream
    proxy_ssl_server_name on;
    proxy_set_header Host ollama.myurl.com;

    # Inject Cloudflare Access headers
    proxy_set_header CF-Access-Client-Id ${CF_CLIENT_ID};
    proxy_set_header CF-Access-Client-Secret ${CF_CLIENT_SECRET};

    # Good for streaming responses
    proxy_buffering off;
    proxy_read_timeout 3600s;

    # Common forward headers
    proxy_set_header Connection "";
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Forwarded-Host $host;
  }
}

Then just point Open WebUI at the Docker container (either using host.docker.internal or other networking) on port 11434.
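For completeness, a compose service to run this sidecar might look like the following (a sketch under assumptions — the service name, `build: .` setup, and example upstream URL are illustrative; the environment variable names match the template above):

```yaml
services:
  ollama-proxy:
    build: .                      # builds the nginx Dockerfile above
    container_name: ollama-proxy
    ports:
      - "11434:11434"
    environment:
      UPSTREAM: 'https://ollama.myurl.com'   # substituted into proxy_pass
      CF_CLIENT_ID: 'YOUR_ID.access'
      CF_CLIENT_SECRET: 'YOUR_SECRET'
```

Open WebUI's OLLAMA_BASE_URLS would then point at this container, e.g. http://ollama-proxy:11434 on a shared Docker network.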


Reference: github-starred/open-webui#6636