[GH-ISSUE #6712] 400 Bad Request when running behind Nginx Proxy Manager #4228

Closed
opened 2026-04-12 15:09:45 -05:00 by GiteaMirror · 14 comments
Owner

Originally created by @Joly0 on GitHub (Sep 9, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6712

What is the issue?

Hey guys, I have an Ollama instance that I would like to make public (with basic auth, of course) through Nginx Proxy Manager, but whenever I try to reach the API, even with a simple request like `Invoke-RestMethod -Method Get -Uri https://ollama.mydomain.com/api/tags`, I get the error `Invoke-RestMethod: 400 Bad Request`.

OS

Docker

GPU

Nvidia

CPU

AMD

Ollama version

0.3.9

GiteaMirror added the bug label 2026-04-12 15:09:45 -05:00

@rick-github commented on GitHub (Sep 9, 2024):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) may help in debugging. Also logs and configuration from the nginx server.


@Joly0 commented on GitHub (Sep 9, 2024):

> [Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) may help in debugging. Also logs and configuration from the nginx server.

Sure, no problem.

Server Logs:

```
2024/09/09 14:42:38 routes.go:1125: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/root/.ollama/models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[* http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR: OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-09-09T14:42:38.185Z level=INFO source=images.go:753 msg="total blobs: 84"
time=2024-09-09T14:42:38.187Z level=INFO source=images.go:760 msg="total unused blobs removed: 0"
time=2024-09-09T14:42:38.189Z level=INFO source=routes.go:1172 msg="Listening on [::]:11434 (version 0.3.9)"
time=2024-09-09T14:42:38.190Z level=INFO source=payload.go:30 msg="extracting embedded files" dir=/tmp/ollama1022117231/runners
time=2024-09-09T14:42:45.253Z level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu_avx2 cuda_v11 cuda_v12 rocm_v60102 cpu cpu_avx]"
time=2024-09-09T14:42:45.253Z level=INFO source=gpu.go:200 msg="looking for compatible GPUs"
time=2024-09-09T14:42:45.447Z level=WARN source=amd_linux.go:59 msg="ollama recommends running the https://www.amd.com/en/support/linux-drivers" error="amdgpu version file missing: /sys/module/amdgpu/version stat /sys/module/amdgpu/version: no such file or directory"
time=2024-09-09T14:42:45.448Z level=WARN source=amd_linux.go:201 msg="amdgpu too old gfx000" gpu=0
time=2024-09-09T14:42:45.448Z level=INFO source=amd_linux.go:360 msg="no compatible amdgpu devices detected"
time=2024-09-09T14:42:45.448Z level=INFO source=types.go:107 msg="inference compute" id=GPU-ID library=cuda variant=v12 compute=8.9 driver=12.6 name="NVIDIA GeForce RTX 4060 Ti" total="15.6 GiB" available="15.4 GiB"
[GIN] 2024/09/09 - 14:45:18 | 200 |     644.411µs |       127.0.0.1 | GET      "/api/version"
```

Nginx Access Log when accessing the api using the command from my initial post:
```
[09/Sep/2024:17:11:15 +0200] ollama.mydomain.com 192.168.178.174 0.000 "GET /api/tags HTTP/1.1" 400 25 213 - Mozilla/5.0 (Windows NT 10.0; Microsoft Windows 10.0.22631; de-DE) PowerShell/7.4.5
```

Nginx config for that location (/api/tags) I am using for testing:

```
location /api/tags {
    # Base Header Settings
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Real-IP $remote_addr;

    # Proxy Header Settings
    proxy_set_header Early-Data $ssl_early_data;
    proxy_set_header Proxy "";
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header X-Forwarded-Host $host;
    proxy_set_header X-Forwarded-Ssl on;

    proxy_buffering off;
    proxy_set_header Origin '';
    proxy_set_header Referer '';

    set $forward_path "";

    proxy_set_header Upgrade    $http_upgrade;
    proxy_set_header Connection $connection_upgrade;

    include conf.d/include/proxy-location.conf;
    proxy_set_header X-Forwarded-Host $host/api/tags;
    if ($forward_path = "") {
        rewrite ^/api/tags(/.*)$ $1 break;
    }
    proxy_pass http://192.168.178.220:11434;
}
```

@rick-github commented on GitHub (Sep 9, 2024):

The nginx log line doesn't look like the default, I take it that 192.168.178.174 is the client?

What's the result of `docker exec nginx curl -s 192.168.178.220:11434`?

What are you trying to achieve with the `rewrite`?


@Joly0 commented on GitHub (Sep 9, 2024):

The command returns `Ollama is running`.

The rewrite is not from me, that's pre-added by Nginx Proxy Manager.

For better clarity, this is what I can configure in Nginx Proxy Manager to access Ollama:

(screenshots: Nginx Proxy Manager proxy host and custom location settings)
Advanced Configuration here has this configured:

```
# Base Header Settings
proxy_set_header Host $host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header X-Real-IP $remote_addr;

# Proxy Header Settings
proxy_set_header Early-Data $ssl_early_data;
proxy_set_header Proxy "";
proxy_set_header Upgrade $http_upgrade;
proxy_set_header X-Forwarded-Host $host;
proxy_set_header X-Forwarded-Ssl on;

proxy_buffering off;
proxy_set_header Origin '';
proxy_set_header Referer '';
```



@Joly0 commented on GitHub (Sep 9, 2024):

In this config, port 8483 is for Open WebUI and 11434 is plain Ollama. I have both exposed under the same domain.


@Joly0 commented on GitHub (Sep 9, 2024):

And as a side note: I have previously used https://github.com/ParisNeo/ollama_proxy_server to access Ollama through the exact same reverse proxy with the exact same settings (just a different port), and that worked without problems. I don't need ollama_proxy_server anymore, which is why I want to use Ollama directly with nginx-proxy-manager. It's also why I know the setup should work: it worked before, just with ollama_proxy_server in between.


@rick-github commented on GitHub (Sep 9, 2024):

The server logs don't extend to the time the failure was logged in nginx, do you have that part of the log?


@Joly0 commented on GitHub (Sep 9, 2024):

> The server logs don't extend to the time the failure was logged in nginx, do you have that part of the log?

Not sure what exactly you mean? When I request the Ollama API through Nginx Proxy Manager, I don't get anything in the Ollama logs. What I posted is all I get; there is nothing more. I can request the Ollama instance using the same command with the domain replaced by ip:port, and that works and shows an entry in the server logs, but not through nginx.


@rick-github commented on GitHub (Sep 9, 2024):

That's fine, I just wanted to verify that the request is not being forwarded from nginx. Since the curl worked, there are no connectivity issues; it's just an nginx config problem. On the face of it, it looks like it should work, and it did work with ollama_proxy_server in the mix. Where did the nginx container come from?


@Joly0 commented on GitHub (Sep 9, 2024):

The nginx reverse proxy is this one: https://github.com/ZoeyVid/NPMplus, a fork of https://github.com/NginxProxyManager/nginx-proxy-manager, which itself is a very popular frontend for plain nginx with additional features. The reverse proxy runs in a Docker container as well, on the same Unraid host as the Ollama instance.


@Joly0 commented on GitHub (Sep 9, 2024):

And for additional information:

(screenshot: nginx access log entries for both requests)

The first request was made after changing only the port of the /api/tags endpoint to the ollama_proxy_server container running on the same host, which is configured to forward requests to "http://192.168.178.220:11434".
The second request goes to the Ollama instance directly through Nginx Proxy Manager.


@Joly0 commented on GitHub (Sep 9, 2024):

OK, I think I have figured it out.
After a lot of trial and error, I found that this header caused the issue:
`proxy_set_header Host $host;`
but I have no idea why that would be.
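
For reference, a minimal sketch of the resulting location block (not verified against NPMplus; the upstream address is the one from this thread). When no `proxy_set_header Host ...` is given, nginx falls back to its default of sending `Host: $proxy_host`, i.e. the upstream address, which matches what the backend saw when curled directly:

```nginx
location /api/tags {
    # Forwarding metadata, unchanged from the config above
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Real-IP $remote_addr;

    # NOTE: deliberately no "proxy_set_header Host $host;" here.
    # Without it, nginx sends "Host: $proxy_host" (the upstream
    # address below), which avoided the 400 reported in this thread.

    proxy_buffering off;
    proxy_pass http://192.168.178.220:11434;
}
```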


@kennyparsons commented on GitHub (Sep 26, 2024):

This was my fix for a plain nginx reverse proxy as well. Thanks.


@teddius commented on GitHub (Oct 17, 2024):

@Joly0 Can you post your complete nginx config, please, so we can see the full Ollama setup?

Reference: github-starred/ollama#4228