Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-22 14:13:08 -05:00)
rag not working when hosted on a nginx server #442
Originally created by @britster03 on GitHub (Mar 8, 2024).
Bug Report
Description
Bug Summary:
Created an nginx configuration file on the server and included the following code in it:

```nginx
server {
    listen 80;
    server_name myapp.example.com;  # Change this to your domain
}
```
This was so that I could serve the app at the https://xyz.in URL. After that I installed Open WebUI using the Docker command given below. Open WebUI started successfully and the normal chat feature works perfectly, but when I upload a file such as a PDF for RAG, I get a `POST https://xyz.in/ollama/api/chat 504/524 Gateway timed out` error; an image of the console log is attached below. Cloudflare is enabled, but all the ports are accessible and I have used ufw for that.
RAG is not working at all; it is not able to contact /ollama/api.
My Ollama backend is running on http://localhost:11434, and I am able to fetch all the models too.
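For comparison, note that the `server` block shown above only sets `listen` and `server_name`; it contains no `location` block proxying traffic to Open WebUI, and nginx's default `proxy_read_timeout` is 60 seconds, which long RAG/embedding requests can easily exceed (Cloudflare additionally cuts off slow origin responses with a 524 at roughly 100 seconds). A fuller reverse-proxy sketch, assuming Open WebUI is listening on localhost:8080 as described in the reproduction steps, might look like this:

```nginx
server {
    listen 80;
    server_name myapp.example.com;  # change to your domain

    location / {
        proxy_pass http://127.0.0.1:8080;  # assumption: Open WebUI on host port 8080
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # WebSocket upgrade headers, used by the chat UI
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";

        # Long-running RAG requests need more generous timeouts than the 60s default
        proxy_read_timeout 300s;
        proxy_send_timeout 300s;
    }
}
```

Raising nginx's timeouts only removes the 504; Cloudflare's own 524 limit cannot be changed from nginx, so requests proxied through Cloudflare still need to complete within its window.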
Steps to Reproduce:
1. Create an nginx server config file and include the code given above.
2. Use the Docker command to run Open WebUI on http://localhost:8080.
3. Fetch the models from the Ollama backend.
4. Try a normal chat with any model, then try a PDF chat; the error will appear.
Expected Behavior:
A response should be returned for the PDF chat.
Actual Behavior:
No response is given; a gateway timed out error is returned instead.
Environment
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
Please refer to the attached image.
Docker Container Logs:
Please refer to the attached image.
Screenshots (if applicable):
Screenshot attached.
Installation Method

```shell
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
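Because the container runs with `--network=host`, Open WebUI binds directly on the host's port 8080, so nginx can reach it at 127.0.0.1:8080 without any port mapping. A quick diagnostic sketch to separate a proxy timeout from a slow backend (these commands target the reporter's setup, with xyz.in standing in for the real domain, so they are illustrative rather than runnable as-is):

```shell
# 1. Confirm Ollama responds directly on the host (should list models)
curl http://127.0.0.1:11434/api/tags

# 2. Confirm Open WebUI responds directly, bypassing nginx and Cloudflare
curl -I http://127.0.0.1:8080

# 3. Time the failing route through the proxy; if it fails at ~60s the 504
#    is nginx's proxy_read_timeout, at ~100s it is Cloudflare's 524 limit
time curl -i https://xyz.in/ollama/api/version
```

If steps 1 and 2 succeed but step 3 times out, the problem lies in the proxy chain rather than in Ollama or Open WebUI themselves.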
Note
I followed all the steps and also referred to several related issues, such as #89, #802, and #803.
@tjbck commented on GitHub (Mar 8, 2024):
On v0.1.110 right?