[GH-ISSUE #5811] Ollama behind reverse proxy returns 404 #3622

Closed
opened 2026-04-12 14:23:21 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @rwjack on GitHub (Jul 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5811

What is the issue?

I have a complex proxy setup, but I can't seem to get Ollama set up.

user -> proxy -> ingress-proxy -> docker

```
services:
  ollama:
    image: ollama/ollama:latest
    ...
    ports:
      - "127.0.0.1:15007:11434"
```

When I `curl 127.0.0.1:15007` from the docker host, I get the proper `Ollama is running` response.

When I `curl https://docker-host-fqdn:15443 -H "Host: ollama.hq.arpa"` from the docker host (the ingress runs on the docker host as well), I get a 404.

This leads me to believe that Ollama isn't accepting the request for some reason. I have 20+ other services that just work, and the proxy/ingress configuration is unified across all of them, so I'm unsure what else it could be other than some Ollama-specific setting. I tried several different values for the `OLLAMA_ORIGINS` variable, including `"*"`, but no joy.
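Since the direct curl succeeds while the proxied one returns 404, a useful next step is establishing which hop emits the 404. Below is a hedged diagnostic sketch reusing the hostnames and ports from this report (all illustrative); note that `OLLAMA_ORIGINS` governs CORS checks on the browser `Origin` header, so it is unlikely to be the cause of a 404 seen from plain curl.

```shell
# Diagnostic sketch -- hostnames/ports taken from this report; adjust as needed.

# 1) Direct hit on the published port (known to work):
curl -s 127.0.0.1:15007
# -> Ollama is running

# 2) Same request through the ingress, verbose; -k skips TLS name
#    verification since we are overriding the Host header:
curl -vk https://docker-host-fqdn:15443/ -H "Host: ollama.hq.arpa"

# 3) Inspect the 404 body. Ollama serves "Ollama is running" at "/",
#    so a plain-text "404 page not found" (the default from Ollama's
#    HTTP framework) suggests the request DID reach Ollama but with a
#    rewritten path -- check the ingress for path rewriting. A styled
#    or proxy-branded 404 page instead suggests the ingress never
#    matched a route for Host ollama.hq.arpa.
curl -sk https://docker-host-fqdn:15443/api/tags -H "Host: ollama.hq.arpa"
```

Comparing the 404 bodies from steps 2 and 3 against Ollama's direct responses should narrow the failure down to either the ingress route matching or a path rewrite.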

OS

Linux

GPU

No response

CPU

No response

Ollama version

0.2.7

GiteaMirror added the bug label 2026-04-12 14:23:21 -05:00
Reference: github-starred/ollama#3622