rag not working when hosted on a nginx server #442

Closed
opened 2025-11-11 14:21:29 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @britster03 on GitHub (Mar 8, 2024).

Screenshot (239)

Bug Report

Description

Bug Summary:
Created an nginx configuration file on the server with the following contents:
```nginx
server {
    listen 80;
    server_name myapp.example.com; # Change this to your domain

    location / {
        proxy_pass http://localhost:8080; # Forward requests to the Docker app
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
```
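A 504 from nginx generally means the upstream did not answer within `proxy_read_timeout` (60 seconds by default), and a 524 is Cloudflare's own origin-timeout status; long RAG requests (document embedding plus generation) can exceed both. As a sketch only (the timeout values are illustrative, not taken from this report), the limits can be raised inside the same `location` block:

```nginx
location / {
    proxy_pass http://localhost:8080;
    # Allow long-running RAG/chat requests to complete
    # (values are illustrative; tune to your workload)
    proxy_connect_timeout 60s;
    proxy_send_timeout    300s;
    proxy_read_timeout    300s;
}
```

Note that Cloudflare proxied traffic is still subject to Cloudflare's own origin timeout (around 100 seconds on the free plan), regardless of what nginx allows.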
This lets me serve the app at the https://xyz.in URL. After that I installed Open WebUI using the Docker command given below. Open WebUI started successfully and the normal chat feature works perfectly, but when I upload a file such as a PDF for RAG, it gives a `POST https://xyz.in/ollama/api/chat 504/524 Gateway timed out` error; an image of the console log is attached below. Cloudflare is enabled, but all ports are accessible (I opened them with ufw).
RAG is not working at all; it cannot reach /ollama/api.
My Ollama backend is running on http://localhost:11434, and I am able to fetch all the models.
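To narrow down whether the timeout comes from the proxy chain or from Ollama itself, it can help to probe the backend directly and through the proxy. This is a hypothetical helper, not part of the original report; the URLs in the comments are the ones mentioned above, and `/api/tags` is Ollama's model-listing endpoint:

```python
import urllib.request
import urllib.error

def probe(url: str, timeout: float = 5.0) -> str:
    """Return 'ok <status>' if the endpoint answers, else 'error: <reason>'."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return f"ok {resp.status}"
    except (urllib.error.URLError, OSError) as exc:
        return f"error: {exc}"

# Hypothetical checks mirroring the setup described above:
# probe("http://localhost:11434/api/tags")   # Ollama directly
# probe("https://xyz.in/ollama/api/tags")    # through nginx + Cloudflare
```

If the direct probe succeeds but the proxied one times out, the problem is in nginx or Cloudflare rather than in Ollama.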

Steps to Reproduce:
1. Create an nginx server config file and include the code given above.
2. Use the Docker command to run Open WebUI on http://localhost:8080.
3. Fetch the models from the Ollama backend.
4. Try a normal chat with any model, then try a PDF chat; the error will appear.

Expected Behavior:
The PDF chat should return a response.

Actual Behavior:
No response is returned; instead, a gateway timed out error is shown.

Environment

  • Operating System: [e.g. Ubuntu 20.04]
  • Browser (if applicable): [e.g., Chrome 100.0, Firefox 98.0]

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
Please refer to the attached image.

Docker Container Logs:
Please refer to the attached image.
WhatsApp Image 2024-03-08 at 2 18 13 PM

Screenshots (if applicable):
Attached the screenshot

Installation Method

```shell
docker run -d --network=host \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

Note

I followed all the steps and also referred to many related issues, such as #89, #802, and #803.

Author
Owner

@tjbck commented on GitHub (Mar 8, 2024):

On v0.1.110 right?


Reference: github-starred/open-webui#442