[GH-ISSUE #16158] issue: Processing does not continue after open_webui.retrieval.utils:generate_openai_batch_embeddings call #56473

Closed
opened 2026-05-05 19:29:18 -05:00 by GiteaMirror · 18 comments
Owner

Originally created by @BAngelis on GitHub (Jul 30, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/16158

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

v0.6.18

Ollama Version (if applicable)

n/a

Operating System

Ubuntu 22.04 LTS

Browser (if applicable)

Chrome 138.0.7204.169 (Official Build) (arm64)

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
      • Start with the initial platform/version/OS and dependencies used,
      • Specify exact install/launch/configure commands,
      • List URLs visited, user input (incl. example values/emails/passwords if needed),
      • Describe all options and toggles enabled or changed,
      • Include any files or environmental changes,
      • Identify the expected and actual result at each stage,
      • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

After tens of minutes of successful request/response traffic, leaving the system idle for 30 minutes and then making a request to Open WebUI should still return a response to the browser.

Actual Behavior

After tens of minutes of successful request/response handling, once the system has been idle for 30 minutes, making a request to Open WebUI does not return a response to the browser, and the browser app eventually times out. Looking at the openwebui log entries, the last log entry is:

ai4l-openwebui | 2025-07-30T00:06:14.538244099Z 2025-07-30 00:06:14.537 | DEBUG | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 1 - {}

Then, 3 minutes later, the openwebui container goes "unhealthy" and is restarted by my watchdog.

Here are the logs leading up to the issue:

Steps to Reproduce

Overall deployment environment:

  • Azure VM running Ubuntu 22.04
  • Docker installed on the VM.
  • nginx configured as a reverse proxy to support HTTPS
  • Pinecone as the vector store
  • GLOBAL_LOG_LEVEL set to DEBUG
  • OpenAI used for the LLM and the embeddings model
  • Two client browsers, one on macOS and one on iOS

Steps to reproduce

  • Using two clients, one on iOS Chrome and one on macOS Chrome, I issue several chat requests and receive responses from models configured to use my knowledge bases in Pinecone.
  • After a variable number (>10) of successful interactions, let the system go idle by not sending any requests for 30+ minutes.
  • Issue a chat request; the request never returns. Eventually the browser displays an error message.
  • Look at the Docker logs (with DEBUG enabled) and see that request processing started but never logged any further progress after the call to open_webui.retrieval.utils:generate_openai_batch_embeddings.
  • After 3 minutes of no logging from Open WebUI, the container is restarted by my watchdog.
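
The hang pattern (last log line right before an outbound HTTPS call to OpenAI, only after an idle gap) is consistent with a stale keep-alive connection being silently dropped by an intermediate hop (Azure's load balancer drops idle TCP flows after roughly 4 minutes by default) while the client blocks on a socket with no timeout. This is only a hypothesis, not confirmed Open WebUI behavior; a minimal, self-contained sketch of the failure mode (all names here are illustrative, not Open WebUI code):

```python
import socket
import threading

def silent_server(port_holder, ready):
    """Accept a connection but never send a response, simulating a
    peer (or NAT/LB hop) that silently stops forwarding traffic."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))
    port_holder.append(srv.getsockname()[1])
    srv.listen(1)
    ready.set()
    conn, _ = srv.accept()
    threading.Event().wait(10)  # hold the connection open, say nothing
    conn.close()
    srv.close()

port_holder, ready = [], threading.Event()
threading.Thread(target=silent_server, args=(port_holder, ready), daemon=True).start()
ready.wait()

client = socket.create_connection(("127.0.0.1", port_holder[0]))
client.settimeout(2)  # without this, recv() below blocks indefinitely
client.sendall(b"POST /v1/embeddings HTTP/1.1\r\nHost: example\r\n\r\n")
try:
    client.recv(4096)
    outcome = "got data"
except socket.timeout:
    outcome = "timed out"
finally:
    client.close()
print(outcome)  # prints "timed out"
```

With no `settimeout()` (or no request timeout in the HTTP client), the `recv()` would block forever, which would match the "last log line, then silence" behavior seen here.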

Here is my docker-compose.yml:

```yaml
# Define the logging configuration as an anchor
x-logging: &default-logging
  driver: json-file
  options:
    max-size: "5m"
    max-file: "3"
    compress: "true"

services:
  openwebui:
    image: ghcr.io/open-webui/open-webui:latest
    container_name: ai4l-openwebui
    logging: *default-logging
    environment:
      - GLOBAL_LOG_LEVEL=DEBUG
      - VECTOR_DB=pinecone
      - PINECONE_API_KEY=removed
      - PINECONE_ENVIRONMENT=eastus2
      - PINECONE_INDEX_NAME=ai4l
      - PINECONE_DIMENSION=1536
      - PINECONE_METRIC=cosine
      - PINECONE_CLOUD=azure
      - PORT=8080
      - OAUTH_ENABLED=false
      - WEBUI_ALLOW_REGISTRATION=false
      - RAG_EMBEDDING_OPENAI_BATCH_SIZE=16
      - RAG_EMBEDDING_BATCH_SIZE=16
    volumes:
      - /mnt_openwebui_data/data:/app/backend/data
      - /mnt_openwebui_data/config:/app/backend/config
    # Add net-tools installation
    command: >
      sh -c "
      apt-get update && apt-get install -y net-tools &&
      cd /app/backend && exec bash start.sh
      "
    restart: unless-stopped
    networks:
      - openwebui-network

  nginx:
    image: nginx:latest
    container_name: openwebui-nginx
    logging: *default-logging
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf
      - /etc/letsencrypt:/etc/letsencrypt:ro
    ports:
      - "80:80"
      - "443:443"
    restart: unless-stopped
    networks:
      - openwebui-network
    depends_on:
      - openwebui

  # WebSocket monitoring service
  system-monitor:
    image: alpine:latest
    container_name: websocket-monitor
    logging: *default-logging
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - ./app/watchstats.sh:/app/watchstats.sh:ro
      - ./monitor_logs:/app/logs
    command: >
      sh -c "
      apk add --no-cache docker &&
      cd /app &&
      exec sh ./watchstats.sh
      "
    restart: unless-stopped
    networks:
      - openwebui-network

networks:
  openwebui-network:
    driver: bridge
```
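
Since the external watchdog only reacts after 3 minutes of silence, a Docker-native healthcheck on the openwebui service could surface the hang sooner. A hedged sketch (the `/health` endpoint and `curl` availability in the image, plus all interval values, are assumptions to verify):

```yaml
    healthcheck:
      test: ["CMD-SHELL", "curl -fsS http://localhost:8080/health || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 30s
```

Note that `restart: unless-stopped` alone does not restart a container that merely goes unhealthy; the existing watchdog (or a tool such as autoheal) would still need to act on the status reported by `docker inspect`.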

When the issue occurs: logs show a "hang and then a restart". Note the 3 minutes that pass before the restart. My watchdog detects the container is "unhealthy" and restarts it.

nd.openxmlformats-officedocument.wordprocessingml.document', 'size': 36166, 'data': {}, 'collection_name': '19c13be5-12f0-41cb-a109-f2826d8f6382'}, 'created_at': 1749841911, 'updated_at': 1749841911}], 'type': 'collection'}]}, 'access_control': None, 'is_active': True, 'updated_at': 1750898365, 'created_at': 1750898365}, 'preset': True, 'actions': [], 'filters': [], 'tags': []}, 'direct': False}} - {}
ai4l-openwebui | 2025-07-30T00:06:14.538244099Z 2025-07-30 00:06:14.537 | DEBUG | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 1 - {}
ai4l-openwebui | 2025-07-30T00:09:11.308691394Z Hit:1 http://deb.debian.org/debian bookworm InRelease
ai4l-openwebui | 2025-07-30T00:09:11.308721894Z Hit:2 http://deb.debian.org/debian bookworm-updates InRelease
ai4l-openwebui | 2025-07-30T00:09:11.308726794Z Hit:3 http://deb.debian.org/debian-security bookworm-security InRelease
ai4l-openwebui | 2025-07-30T00:09:12.350788408Z Reading package lists...
ai4l-openwebui | 2025-07-30T00:09:13.121076718Z Reading package lists...
ai4l-openwebui | 2025-07-30T00:09:13.300521005Z Building dependency tree...
ai4l-openwebui | 2025-07-30T00:09:13.301012809Z Reading state information...
ai4l-openwebui |

The request that resulted in this "hang" occurred at 2025-07-30T00:06:14.311, about 30 minutes AFTER the previous successful request/response. There was no Open WebUI activity between the previous successful interaction and this one, so the system was idle during this 30-minute gap. The "hangs" often seem to occur after such a gap in usage. While attempting to reproduce the issue, I use two clients, one on macOS and one on iOS (both Chrome), and everything works for many tens of minutes. If I stop testing and then begin again (without restarting the services), I often hit the hang right away (but NOT always).

imageCompressionSize': {'width': '', 'height': ''}, 'landingPageMode': '', 'showUsername': True, 'notifications': {'webhook_url': ''}, 'webSearch': None, 'params': {}, 'audio': {'stt': {}, 'tts': {'engineConfig': {}, 'playbackRate': 1, 'voice': 'EXAVITQu4vr4xnSDxMaL', 'defaultVoice': ''}}, 'memory': True}), info=None, oauth_sub=None))] 1 (0.0000)s - {}
ai4l-openwebui | 2025-07-29T23:27:04.101757591Z 2025-07-29 23:27:04.101 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "POST /api/chat/completed HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-29T23:27:04.940614661Z 2025-07-29 23:27:04.940 | DEBUG | open_webui.socket.main:periodic_usage_pool_cleanup:174 - Cleaning up model 4l---our-company-kb from usage pool - {}
ai4l-openwebui | 2025-07-29T23:27:10.343281379Z 2025-07-29 23:27:10.342 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "POST /api/v1/chats/700a4282-ac0f-4605-b350-734c5e55fecb HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-29T23:27:10.634581236Z 2025-07-29 23:27:10.634 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-29T23:27:10.711955110Z 2025-07-29 23:27:10.711 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "GET /api/v1/folders/ HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-30T00:06:13.918433448Z 2025-07-30 00:06:13.918 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "POST /api/v1/chats/700a4282-ac0f-4605-b350-734c5e55fecb HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-30T00:06:14.135890915Z 2025-07-30 00:06:14.135 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-30T00:06:14.220920748Z 2025-07-30 00:06:14.220 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "GET /api/v1/folders/ HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-30T00:06:14.311436675Z 2025-07-30 00:06:14.310 | DEBUG | open_webui.utils.middleware:process_chat_payload:734 - form_data: {'stream': True, 'model': '4l---our-company-kb', 'messages': [{'role': 'user', 'content':

You can see from the WebSocket monitor that during the hang, connections start to stack up (though not by a large amount, so I don't see this as the cause).

websocket-monitor | 2025-07-30T00:06:14.334385862Z [2025-07-30 00:06:09] MEM: 457.7MiB / 31.35GiB | CPU: 9.36% | CONN: 2 | WS: 1 | PROC: 1 | FD: 35 | SIO: 0
websocket-monitor | 2025-07-30T00:06:14.334419262Z 0
websocket-monitor | 2025-07-30T00:06:23.385120638Z [2025-07-30 00:06:19] MEM: 457.9MiB / 31.35GiB | CPU: 0.13% | CONN: 3 | WS: 2 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:06:23.385166139Z 0
websocket-monitor | 2025-07-30T00:06:32.480488865Z [2025-07-30 00:06:28] MEM: 457.9MiB / 31.35GiB | CPU: 0.14% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:06:32.480527365Z 0
websocket-monitor | 2025-07-30T00:06:41.534318302Z [2025-07-30 00:06:37] MEM: 460.6MiB / 31.35GiB | CPU: 0.13% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:06:41.534351002Z 0
websocket-monitor | 2025-07-30T00:06:50.558386329Z [2025-07-30 00:06:46] MEM: 460.6MiB / 31.35GiB | CPU: 0.15% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:06:50.558423829Z 0
websocket-monitor | 2025-07-30T00:06:59.631845363Z [2025-07-30 00:06:55] MEM: 460.6MiB / 31.35GiB | CPU: 0.13% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:06:59.631912964Z 0
websocket-monitor | 2025-07-30T00:07:08.682524138Z [2025-07-30 00:07:04] MEM: 460.4MiB / 31.35GiB | CPU: 0.13% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:07:08.682567539Z 0
websocket-monitor | 2025-07-30T00:07:17.751985180Z [2025-07-30 00:07:13] MEM: 460.4MiB / 31.35GiB | CPU: 0.14% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0

Then, near the ai4l-openwebui restart, the monitor shows resources being released:

websocket-monitor | 2025-07-30T00:08:02.920865190Z [2025-07-30 00:07:58] MEM: 463.1MiB / 31.35GiB | CPU: 0.16% | CONN: 7 | WS: 6 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:02.920909690Z 0
websocket-monitor | 2025-07-30T00:08:11.950835398Z [2025-07-30 00:08:07] MEM: 462.9MiB / 31.35GiB | CPU: 0.14% | CONN: 7 | WS: 6 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:11.950935599Z 0
websocket-monitor | 2025-07-30T00:08:20.999781865Z [2025-07-30 00:08:16] MEM: 462.9MiB / 31.35GiB | CPU: 0.18% | CONN: 7 | WS: 6 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:20.999821765Z 0
websocket-monitor | 2025-07-30T00:08:30.020799431Z [2025-07-30 00:08:26] MEM: 462.9MiB / 31.35GiB | CPU: 0.15% | CONN: 7 | WS: 6 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:30.020842931Z 0
websocket-monitor | 2025-07-30T00:08:39.034728224Z [2025-07-30 00:08:35] MEM: 465.7MiB / 31.35GiB | CPU: 0.16% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:39.034764825Z 0
websocket-monitor | 2025-07-30T00:08:48.080635586Z [2025-07-30 00:08:44] MEM: 465.7MiB / 31.35GiB | CPU: 0.15% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:48.080666086Z 0
websocket-monitor | 2025-07-30T00:08:57.109347335Z [2025-07-30 00:08:53] MEM: 465.7MiB / 31.35GiB | CPU: 0.17% | CONN: 8 | WS: 7 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:57.109387735Z 0
websocket-monitor | 2025-07-30T00:09:06.127351081Z [2025-07-30 00:09:02] MEM: 465.5MiB / 31.35GiB | CPU: 0.16% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:09:06.127390781Z 0
websocket-monitor | 2025-07-30T00:09:15.310945779Z [2025-07-30 00:09:11] MEM: 39.66MiB / 31.35GiB | CPU: 100.50% | CONN: 0
websocket-monitor | 2025-07-30T00:09:15.310979079Z 0 | WS: 0
websocket-monitor | 2025-07-30T00:09:15.310984079Z 0 | PROC: 1 | FD: 23 | SIO: 0
websocket-monitor | 2025-07-30T00:09:15.310987879Z 0
websocket-monitor | 2025-07-30T00:09:24.314871368Z [2025-07-30 00:09:20] MEM: 243.2MiB / 31.35GiB | CPU: 100.11% | CONN: 1 | WS: 0
websocket-monitor | 2025-07-30T00:09:24.314898268Z 0 | PROC: 1 | FD: 27 | SIO: 0
websocket-monitor | 2025-07-30T00:09:24.314902968Z 0
websocket-monitor | 2025-07-30T00:09:33.360325695Z [2025-07-30 00:09:29] MEM: 324.3MiB / 31.35GiB | CPU: 3.91% | CONN: 2 | WS: 1 | PROC: 1 | FD: 30 | SIO: 0

Logs & Screenshots

The last successful log message is from open_webui.retrieval.utils:generate_openai_batch_embeddings:

ai4l-openwebui | 2025-07-30T00:06:14.538244099Z 2025-07-30 00:06:14.537 | DEBUG | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 1 - {}

Normal logs from previous successful calls (this is what I expect to see after each generate_openai_batch_embeddings call):

ai4l-openwebui | 2025-07-29T23:26:56.671475035Z 2025-07-29 23:26:56.671 | DEBUG | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 3 - {}
ai4l-openwebui | 2025-07-29T23:26:56.993248363Z 2025-07-29 23:26:56.992 | DEBUG | open_webui.retrieval.utils:query_collection:299 - query_collection: processing 3 queries across 1 collections - {}
ai4l-openwebui | 2025-07-29T23:26:56.993677564Z 2025-07-29 23:26:56.993 | DEBUG | open_webui.retrieval.utils:query_doc:85 - query_doc:doc 19c13be5-12f0-41cb-a109-f2826d8f6382 - {}
ai4l-openwebui | 2025-07-29T23:26:56.994781567Z 2025-07-29 23:26:56.994 | DEBUG | open_webui.retrieval.utils:query_doc:85 - query_doc:doc 19c13be5-12f0-41cb-a109-f2826d8f6382 - {}
ai4l-openwebui | 2025-07-29T23:26:56.995842570Z 2025-07-29 23:26:56.995 | DEBUG | open_webui.retrieval.utils:query_doc:85 - query_doc:doc 19c13be5-12f0-41cb-a109-f2826d8f6382 - {}
ai4l-openwebui | 2025-07-29T23:26:57.068472657Z 2025-07-29 23:26:57.067 | INFO | open_webui.retrieval.utils:query_doc:93 - query_doc:result [['76245fe7-22a8-4c31-a3ff-6b500dc48e9b', '0e04f732-264f-48a2-8cf3-860592f318e3', 'f65ed728-a131-4d10-873d-07e44641cede', 'a579a348-86fb-47ca-b00e-9448eb1fbc18', '43ab5288-a960-4da7-917b-4c934d63ddff', '01b21117-6960-4007-8bb6-63935075454a', '1e9e202a-0172-
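
To pin down exactly where the process is blocked, a thread stack dump taken during the hang would be more conclusive than the last DEBUG line. One option (assuming py-spy can be installed in the container) is `py-spy dump --pid <pid>` via `docker exec`. Alternatively, Python's stdlib faulthandler can dump every thread's stack; the snippet below is an illustrative, self-contained sketch, not Open WebUI code:

```python
import faulthandler
import signal
import tempfile

# In a long-running service you would register a signal handler once at
# startup, then send `kill -USR1 <pid>` from outside when it hangs:
#   faulthandler.register(signal.SIGUSR1, all_threads=True)
# For a self-contained demo, dump all thread stacks right now instead.
with tempfile.NamedTemporaryFile("w+", suffix=".txt") as f:
    faulthandler.dump_traceback(file=f, all_threads=True)
    f.seek(0)
    dump = f.read()

# The dump lists each thread with "File ... line ... in ..." frames,
# which would show whether the worker is stuck in a socket read.
print("File" in dump)
```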


Additional Information

Here is my nginx configuration file:

```nginx
events {
    worker_connections 1024;
}

http {
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                    '$status $body_bytes_sent "$http_referer" '
                    '"$http_user_agent" "$http_x_forwarded_for"';

    access_log /var/log/nginx/access.log main;
    error_log /var/log/nginx/error.log debug;

    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;
    types_hash_max_size 2048;

    # Allow larger uploads for OpenWebUI
    client_max_body_size 100M;

    # Rate limiting to prevent abuse
    limit_req_zone $binary_remote_addr zone=api:10m rate=10r/s;
    limit_req_zone $binary_remote_addr zone=login:10m rate=1r/s;

    # Block known malicious IPs
    deny 78.153.140.203;

    # HTTP server - redirect to HTTPS
    server {
        listen 80;
        listen 8080;
        server_name my.web.site nnn.nnn.nnn.nnn;

        location / {
            return 301 https://$host$request_uri;
        }
    }

    # HTTPS server for legitimate domain
    server {
        listen 443 ssl http2;
        server_name my.web.site;

        # SSL configuration
        ssl_certificate /etc/letsencrypt/live/my.web.site/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/my.web.site/privkey.pem;

        # SSL settings for security
        ssl_protocols TLSv1.2 TLSv1.3;
        ssl_ciphers ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES128-SHA256:ECDHE-RSA-AES256-SHA384;
        ssl_prefer_server_ciphers on;
        ssl_session_cache shared:SSL:10m;
        ssl_session_timeout 10m;

        # Security headers
        add_header X-Frame-Options "SAMEORIGIN" always;
        add_header X-Content-Type-Options "nosniff" always;
        add_header X-XSS-Protection "1; mode=block" always;
        add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;

        # Block access to sensitive files
        location ~ /\.(env|git|svn|htaccess|htpasswd) {
            deny all;
            return 404;
        }

        location ~ \.(yml|yaml|json|xml|log|bak|backup|old|txt)$ {
            deny all;
            return 404;
        }

        # Block PHP files (since you're not using PHP)
        location ~ \.php$ {
            deny all;
            return 404;
        }

        # Apply rate limiting to main application
        location / {
            limit_req zone=api burst=20 nodelay;

            proxy_pass http://ai4l-openwebui:8080;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_set_header X-Forwarded-Host $host;
            proxy_set_header X-Forwarded-Port $server_port;

            # WebSocket support
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";

            # (Optional) Disable proxy buffering for better streaming response from models
            proxy_buffering off;

            # Timeouts
            proxy_connect_timeout 60s;
            proxy_send_timeout 60s;
            proxy_read_timeout 10m;

            proxy_cache_bypass $http_upgrade;
        }
    }

    # HTTPS server for IP access - block all suspicious requests
    server {
        listen 443 ssl http2;
        server_name nnn.nnn.nnn.nnn;

        # Use same SSL certificate (or create a self-signed one)
        ssl_certificate /etc/letsencrypt/live/my.web.site/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/my.web.site/privkey.pem;

        ssl_protocols TLSv1.2 TLSv1.3;
        ssl_ciphers ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384;

        # Block all requests to IP-based access to prevent scanning
        location / {
            return 444;  # Close connection without response
        }
    }

    # Catch-all server for any other requests
    server {
        listen 443 ssl http2 default_server;
        server_name _;

        # Dummy SSL certificate
        ssl_certificate /etc/letsencrypt/live/my.web.site/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/my.web.site/privkey.pem;

        ssl_protocols TLSv1.2 TLSv1.3;
        ssl_ciphers ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384;

        return 444;  # Close connection without response
    }
}
```
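
One nginx detail worth checking (likely unrelated to the backend hang, but it removes a variable): the `location /` block sends `Connection "upgrade"` on every proxied request, not just WebSocket upgrades. The conventional pattern from the nginx WebSocket-proxying docs uses a map so plain HTTP requests keep normal connection semantics; a hedged sketch:

```nginx
# In the http block:
map $http_upgrade $connection_upgrade {
    default upgrade;
    ''      close;
}

# In the location / block, replace the hardcoded header:
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $connection_upgrade;
```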
Originally created by @BAngelis on GitHub (Jul 30, 2025). Original GitHub issue: https://github.com/open-webui/open-webui/issues/16158 ### Check Existing Issues - [x] I have searched the existing issues and discussions. - [x] I am using the latest version of Open WebUI. ### Installation Method Docker ### Open WebUI Version v0.6.18 ### Ollama Version (if applicable) n/a ### Operating System Ubuntu 22.04 LTS ### Browser (if applicable) Chrome 138.0.7204.169 (Official Build) (arm64) ### Confirmation - [x] I have read and followed all instructions in `README.md`. - [x] I am using the latest version of **both** Open WebUI and Ollama. - [ ] I have included the browser console logs. - [x] I have included the Docker container logs. - [x] I have **provided every relevant configuration, setting, and environment variable used in my setup.** - [x] I have clearly **listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup** (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc). - [x] I have documented **step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation**. My steps: - Start with the initial platform/version/OS and dependencies used, - Specify exact install/launch/configure commands, - List URLs visited, user input (incl. example values/emails/passwords if needed), - Describe all options and toggles enabled or changed, - Include any files or environmental changes, - Identify the expected and actual result at each stage, - Ensure any reasonably skilled user can follow and hit the same issue. ### Expected Behavior After tens of minutes of having successful request responses, once leaving the system idle for 30 minutes and making a request to OpenWeb UI you expect a response back to the browser, but it never comes. 
### Actual Behavior After tens of minutes of having successful request/response handled, once leaving the system idle for 30 minutes and making a request to OpenWeb UI does not return a response to the browser, and the browser app eventually times out. Looking at the openwebui log entries, you can see the last log entry is : ai4l-openwebui | 2025-07-30T00:06:14.538244099Z 2025-07-30 00:06:14.537 | DEBUG | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 1 - {} Then, 3 minutes later the openwebui container goes "unhealthy" and is restarted by my watchdog. Here are the logs leading up to the issue: ### Steps to Reproduce **Overall deployment environment:** - Azure VM running Ubuntu 22.04 - Docker installed on the VM. - nginx configured as a reverse proxy to support https - Pinecone as the vector store - Set DEBUG - OpenAI used for LLM and embeddings model - Two Client browsers running on MacOS and IOS **Steps to reproduce** - Using two clients, one from IOS Chrome and one from MacOS Chrome, I issue several chat requests and receive responses from models configured to use my Knowledge bases in Pinecone. - After a variable number (>10) of successful interactions, let the system go idle by not sending any requests for 30+ minutes. - Issue a chat request and the request never returns. Eventually the browser displays an error message. 
- Look at the Docker logs (with DEBUG enabled) and see that request processing started but never logged any more progress after the call to open_webui.retrieval.utils:generate_openai_batch_embeddings.
- After 3 minutes of no logging from Open WebUI, the container is restarted by my watchdog.

**Here is my docker-compose.yml**

```yaml
# Define the logging configuration as an anchor
x-logging: &default-logging
  driver: json-file
  options:
    max-size: "5m"
    max-file: "3"
    compress: "true"

services:
  openwebui:
    image: ghcr.io/open-webui/open-webui:latest
    container_name: ai4l-openwebui
    logging: *default-logging
    environment:
      - GLOBAL_LOG_LEVEL=DEBUG
      - VECTOR_DB=pinecone
      - PINECONE_API_KEY=removed
      - PINECONE_ENVIRONMENT=eastus2
      - PINECONE_INDEX_NAME=ai4l
      - PINECONE_DIMENSION=1536
      - PINECONE_METRIC=cosine
      - PINECONE_CLOUD=azure
      - PORT=8080
      - OAUTH_ENABLED=false
      - WEBUI_ALLOW_REGISTRATION=false
      - RAG_EMBEDDING_OPENAI_BATCH_SIZE=16
      - RAG_EMBEDDING_BATCH_SIZE=16
    volumes:
      - /mnt_openwebui_data/data:/app/backend/data
      - /mnt_openwebui_data/config:/app/backend/config
    # Add net-tools installation
    command: >
      sh -c "
      apt-get update &&
      apt-get install -y net-tools &&
      cd /app/backend &&
      exec bash start.sh
      "
    restart: unless-stopped
    networks:
      - openwebui-network

  nginx:
    image: nginx:latest
    container_name: openwebui-nginx
    logging: *default-logging
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf
      - /etc/letsencrypt:/etc/letsencrypt:ro
    ports:
      - "80:80"
      - "443:443"
    restart: unless-stopped
    networks:
      - openwebui-network
    depends_on:
      - openwebui

  # WebSocket Monitoring Service
  system-monitor:
    image: alpine:latest
    container_name: websocket-monitor
    logging: *default-logging
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - ./app/watchstats.sh:/app/watchstats.sh:ro
      - ./monitor_logs:/app/logs
    command: >
      sh -c "
      apk add --no-cache docker &&
      cd /app &&
      exec sh ./watchstats.sh
      "
    restart: unless-stopped
    networks:
      - openwebui-network
    depends_on:
      - openwebui

networks:
  openwebui-network:
    driver: bridge
```

**When the issue occurs: logs show a "hang and then a restart". Note the 3 minutes that pass before the restart. My watchdog detects the container is "unhealthy" and restarts it.**

```
<snip>
nd.openxmlformats-officedocument.wordprocessingml.document', 'size': 36166, 'data': {}, 'collection_name': '19c13be5-12f0-41cb-a109-f2826d8f6382'}, 'created_at': 1749841911, 'updated_at': 1749841911}], 'type': 'collection'}]}, 'access_control': None, 'is_active': True, 'updated_at': 1750898365, 'created_at': 1750898365}, 'preset': True, 'actions': [], 'filters': [], 'tags': []}, 'direct': False}} - {}
ai4l-openwebui | 2025-07-30T00:06:14.538244099Z 2025-07-30 00:06:14.537 | DEBUG | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 1 - {}
ai4l-openwebui | 2025-07-30T00:09:11.308691394Z Hit:1 http://deb.debian.org/debian bookworm InRelease
ai4l-openwebui | 2025-07-30T00:09:11.308721894Z Hit:2 http://deb.debian.org/debian bookworm-updates InRelease
ai4l-openwebui | 2025-07-30T00:09:11.308726794Z Hit:3 http://deb.debian.org/debian-security bookworm-security InRelease
ai4l-openwebui | 2025-07-30T00:09:12.350788408Z Reading package lists...
ai4l-openwebui | 2025-07-30T00:09:13.121076718Z Reading package lists...
ai4l-openwebui | 2025-07-30T00:09:13.300521005Z Building dependency tree...
ai4l-openwebui | 2025-07-30T00:09:13.301012809Z Reading state information...
ai4l-openwebui |
</snip>
```

**The request that resulted in this "hang" occurred at 2025-07-30T00:06:14.311. That request was made about 30 minutes AFTER the several previous successful request/responses.** There was no Open WebUI activity between the previous successful interaction and this one, so Open WebUI was "idle" during this 30-minute gap. It seems like the "hangs" often occur when there is a gap in usage. While attempting to reproduce the issue, I use two clients — one on macOS, one on iOS (both using Chrome) — but it works for many tens of minutes. If I stop testing and then begin again (without a reset of the services), I seem to often get this hang right away (but NOT always).

```
<snip>
imageCompressionSize': {'width': '', 'height': ''}, 'landingPageMode': '', 'showUsername': True, 'notifications': {'webhook_url': ''}, 'webSearch': None, 'params': {}, 'audio': {'stt': {}, 'tts': {'engineConfig': {}, 'playbackRate': 1, 'voice': 'EXAVITQu4vr4xnSDxMaL', 'defaultVoice': ''}}, 'memory': True}), info=None, oauth_sub=None))] 1 (0.0000)s - {}
ai4l-openwebui | 2025-07-29T23:27:04.101757591Z 2025-07-29 23:27:04.101 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "POST /api/chat/completed HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-29T23:27:04.940614661Z 2025-07-29 23:27:04.940 | DEBUG | open_webui.socket.main:periodic_usage_pool_cleanup:174 - Cleaning up model 4l---our-company-kb from usage pool - {}
ai4l-openwebui | 2025-07-29T23:27:10.343281379Z 2025-07-29 23:27:10.342 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "POST /api/v1/chats/700a4282-ac0f-4605-b350-734c5e55fecb HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-29T23:27:10.634581236Z 2025-07-29 23:27:10.634 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-29T23:27:10.711955110Z 2025-07-29 23:27:10.711 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "GET /api/v1/folders/ HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-30T00:06:13.918433448Z 2025-07-30 00:06:13.918 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "POST /api/v1/chats/700a4282-ac0f-4605-b350-734c5e55fecb HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-30T00:06:14.135890915Z 2025-07-30 00:06:14.135 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-30T00:06:14.220920748Z 2025-07-30 00:06:14.220 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 98.97.33.128:0 - "GET /api/v1/folders/ HTTP/1.1" 200 - {}
ai4l-openwebui | 2025-07-30T00:06:14.311436675Z 2025-07-30 00:06:14.310 | DEBUG | open_webui.utils.middleware:process_chat_payload:734 - form_data: {'stream': True, 'model': '4l---our-company-kb', 'messages': [{'role': 'user', 'content':
</snip>
```

You can see from the WebSocket monitor that during the time of the hang, connections start to stack up (but NOT a large amount, so I don't see this as causation).

```
websocket-monitor | 2025-07-30T00:06:14.334385862Z [2025-07-30 00:06:09] MEM: 457.7MiB / 31.35GiB | CPU: 9.36% | CONN: 2 | WS: 1 | PROC: 1 | FD: 35 | SIO: 0
websocket-monitor | 2025-07-30T00:06:14.334419262Z 0
websocket-monitor | 2025-07-30T00:06:23.385120638Z [2025-07-30 00:06:19] MEM: 457.9MiB / 31.35GiB | CPU: 0.13% | CONN: 3 | WS: 2 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:06:23.385166139Z 0
websocket-monitor | 2025-07-30T00:06:32.480488865Z [2025-07-30 00:06:28] MEM: 457.9MiB / 31.35GiB | CPU: 0.14% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:06:32.480527365Z 0
websocket-monitor | 2025-07-30T00:06:41.534318302Z [2025-07-30 00:06:37] MEM: 460.6MiB / 31.35GiB | CPU: 0.13% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:06:41.534351002Z 0
websocket-monitor | 2025-07-30T00:06:50.558386329Z [2025-07-30 00:06:46] MEM: 460.6MiB / 31.35GiB | CPU: 0.15% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:06:50.558423829Z 0
websocket-monitor | 2025-07-30T00:06:59.631845363Z [2025-07-30 00:06:55] MEM: 460.6MiB / 31.35GiB | CPU: 0.13% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:06:59.631912964Z 0
websocket-monitor | 2025-07-30T00:07:08.682524138Z [2025-07-30 00:07:04] MEM: 460.4MiB / 31.35GiB | CPU: 0.13% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:07:08.682567539Z 0
websocket-monitor | 2025-07-30T00:07:17.751985180Z [2025-07-30 00:07:13] MEM: 460.4MiB / 31.35GiB | CPU: 0.14% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
```

Then near the ai4l-openwebui restart, the monitor shows resources being released:

```
websocket-monitor | 2025-07-30T00:08:02.920865190Z [2025-07-30 00:07:58] MEM: 463.1MiB / 31.35GiB | CPU: 0.16% | CONN: 7 | WS: 6 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:02.920909690Z 0
websocket-monitor | 2025-07-30T00:08:11.950835398Z [2025-07-30 00:08:07] MEM: 462.9MiB / 31.35GiB | CPU: 0.14% | CONN: 7 | WS: 6 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:11.950935599Z 0
websocket-monitor | 2025-07-30T00:08:20.999781865Z [2025-07-30 00:08:16] MEM: 462.9MiB / 31.35GiB | CPU: 0.18% | CONN: 7 | WS: 6 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:20.999821765Z 0
websocket-monitor | 2025-07-30T00:08:30.020799431Z [2025-07-30 00:08:26] MEM: 462.9MiB / 31.35GiB | CPU: 0.15% | CONN: 7 | WS: 6 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:30.020842931Z 0
websocket-monitor | 2025-07-30T00:08:39.034728224Z [2025-07-30 00:08:35] MEM: 465.7MiB / 31.35GiB | CPU: 0.16% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:39.034764825Z 0
websocket-monitor | 2025-07-30T00:08:48.080635586Z [2025-07-30 00:08:44] MEM: 465.7MiB / 31.35GiB | CPU: 0.15% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:48.080666086Z 0
websocket-monitor | 2025-07-30T00:08:57.109347335Z [2025-07-30 00:08:53] MEM: 465.7MiB / 31.35GiB | CPU: 0.17% | CONN: 8 | WS: 7 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:08:57.109387735Z 0
websocket-monitor | 2025-07-30T00:09:06.127351081Z [2025-07-30 00:09:02] MEM: 465.5MiB / 31.35GiB | CPU: 0.16% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor | 2025-07-30T00:09:06.127390781Z 0
websocket-monitor | 2025-07-30T00:09:15.310945779Z [2025-07-30 00:09:11] MEM: 39.66MiB / 31.35GiB | CPU: 100.50% | CONN: 0
websocket-monitor | 2025-07-30T00:09:15.310979079Z 0 | WS: 0
websocket-monitor | 2025-07-30T00:09:15.310984079Z 0 | PROC: 1 | FD: 23 | SIO: 0
websocket-monitor | 2025-07-30T00:09:15.310987879Z 0
websocket-monitor | 2025-07-30T00:09:24.314871368Z [2025-07-30 00:09:20] MEM: 243.2MiB / 31.35GiB | CPU: 100.11% | CONN: 1 | WS: 0
websocket-monitor | 2025-07-30T00:09:24.314898268Z 0 | PROC: 1 | FD: 27 | SIO: 0
websocket-monitor | 2025-07-30T00:09:24.314902968Z 0
websocket-monitor | 2025-07-30T00:09:33.360325695Z [2025-07-30 00:09:29] MEM: 324.3MiB / 31.35GiB | CPU: 3.91% | CONN: 2 | WS: 1 | PROC: 1 | FD: 30 | SIO: 0
```

### Logs & Screenshots

**The last successful log message is from open_webui.retrieval.utils:generate_openai_batch_embeddings**

```
ai4l-openwebui | 2025-07-30T00:06:14.538244099Z 2025-07-30 00:06:14.537 | DEBUG | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 1 - {}
```

**Normal logs from successful calls done previously (this is what I expect to see after each generate_openai_batch_embeddings call)**

```
<snip>
ai4l-openwebui | 2025-07-29T23:26:56.671475035Z 2025-07-29 23:26:56.671 | DEBUG | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 3 - {}
ai4l-openwebui | 2025-07-29T23:26:56.993248363Z 2025-07-29 23:26:56.992 | DEBUG | open_webui.retrieval.utils:query_collection:299 - query_collection: processing 3 queries across 1 collections - {}
ai4l-openwebui | 2025-07-29T23:26:56.993677564Z 2025-07-29 23:26:56.993 | DEBUG | open_webui.retrieval.utils:query_doc:85 - query_doc:doc 19c13be5-12f0-41cb-a109-f2826d8f6382 - {}
ai4l-openwebui | 2025-07-29T23:26:56.994781567Z 2025-07-29 23:26:56.994 | DEBUG | open_webui.retrieval.utils:query_doc:85 - query_doc:doc 19c13be5-12f0-41cb-a109-f2826d8f6382 - {}
ai4l-openwebui | 2025-07-29T23:26:56.995842570Z 2025-07-29 23:26:56.995 | DEBUG | open_webui.retrieval.utils:query_doc:85 - query_doc:doc 19c13be5-12f0-41cb-a109-f2826d8f6382 - {}
ai4l-openwebui | 2025-07-29T23:26:57.068472657Z 2025-07-29 23:26:57.067 | INFO | open_webui.retrieval.utils:query_doc:93 - query_doc:result [['76245fe7-22a8-4c31-a3ff-6b500dc48e9b', '0e04f732-264f-48a2-8cf3-860592f318e3', 'f65ed728-a131-4d10-873d-07e44641cede', 'a579a348-86fb-47ca-b00e-9448eb1fbc18', '43ab5288-a960-4da7-917b-4c934d63ddff', '01b21117-6960-4007-8bb6-63935075454a', '1e9e202a-0172-</snip>
```

### Additional Information

Here is my nginx configuration file.

```nginx
events {
    worker_connections 1024;
}

http {
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                    '$status $body_bytes_sent "$http_referer" '
                    '"$http_user_agent" "$http_x_forwarded_for"';

    access_log /var/log/nginx/access.log main;
    error_log /var/log/nginx/error.log debug;

    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;
    types_hash_max_size 2048;

    # Allow larger uploads for OpenWebUI
    client_max_body_size 100M;

    # Rate limiting to prevent abuse
    limit_req_zone $binary_remote_addr zone=api:10m rate=10r/s;
    limit_req_zone $binary_remote_addr zone=login:10m rate=1r/s;

    # Block known malicious IPs
    deny 78.153.140.203;

    # HTTP server - redirect to HTTPS
    server {
        listen 80;
        listen 8080;
        server_name my.web.site nnn.nnn.nnn.nnn;

        location / {
            return 301 https://$host$request_uri;
        }
    }

    # HTTPS server for legitimate domain
    server {
        listen 443 ssl http2;
        server_name my.web.site;

        # SSL configuration
        ssl_certificate /etc/letsencrypt/live/my.web.site/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/my.web.site/privkey.pem;

        # SSL settings for security
        ssl_protocols TLSv1.2 TLSv1.3;
        ssl_ciphers ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES128-SHA256:ECDHE-RSA-AES256-SHA384;
        ssl_prefer_server_ciphers on;
        ssl_session_cache shared:SSL:10m;
        ssl_session_timeout 10m;

        # Security headers
        add_header X-Frame-Options "SAMEORIGIN" always;
        add_header X-Content-Type-Options "nosniff" always;
        add_header X-XSS-Protection "1; mode=block" always;
        add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;

        # Block access to sensitive files
        location ~ /\.(env|git|svn|htaccess|htpasswd) {
            deny all;
            return 404;
        }

        location ~ \.(yml|yaml|json|xml|log|bak|backup|old|txt)$ {
            deny all;
            return 404;
        }

        # Block PHP files (since you're not using PHP)
        location ~ \.php$ {
            deny all;
            return 404;
        }

        # Apply rate limiting to main application
        location / {
            limit_req zone=api burst=20 nodelay;
            proxy_pass http://ai4l-openwebui:8080;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_set_header X-Forwarded-Host $host;
            proxy_set_header X-Forwarded-Port $server_port;

            # WebSocket support
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";

            # (Optional) Disable proxy buffering for better streaming response from models
            proxy_buffering off;

            # Timeouts
            proxy_connect_timeout 60s;
            proxy_send_timeout 60s;
            proxy_read_timeout 10m;
            proxy_cache_bypass $http_upgrade;
        }
    }

    # HTTPS server for IP access - block all suspicious requests
    server {
        listen 443 ssl http2;
        server_name nnn.nnn.nnn.nnn;

        # Use same SSL certificate (or create a self-signed one)
        ssl_certificate /etc/letsencrypt/live/my.web.site/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/my.web.site/privkey.pem;

        ssl_protocols TLSv1.2 TLSv1.3;
        ssl_ciphers ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384;

        # Block all requests to IP-based access to prevent scanning
        location / {
            return 444; # Close connection without response
        }
    }

    # Catch-all server for any other requests
    server {
        listen 443 ssl http2 default_server;
        server_name _;

        # Dummy SSL certificate
        ssl_certificate /etc/letsencrypt/live/my.web.site/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/my.web.site/privkey.pem;

        ssl_protocols TLSv1.2 TLSv1.3;
        ssl_ciphers ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384;

        return 444; # Close connection without response
    }
}
```
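For what it's worth, the fail-fast behavior I'm after can be illustrated with a toy sketch (plain asyncio, not Open WebUI's actual code — `stalled_embeddings_call` below is a hypothetical stand-in for an HTTP POST whose idle keep-alive connection was silently dropped): bounding the awaited call with a deadline turns an indefinite hang into a prompt, handleable error.

```python
import asyncio

async def stalled_embeddings_call(texts):
    """Stand-in for an HTTP POST whose keep-alive connection was
    silently dropped while idle: the await never completes."""
    await asyncio.sleep(3600)

async def embed_with_deadline(texts, deadline_s=0.1):
    # Bound the upstream call so a dead connection surfaces as an
    # error the caller can handle, instead of hanging the worker.
    try:
        return await asyncio.wait_for(stalled_embeddings_call(texts), deadline_s)
    except asyncio.TimeoutError:
        return None  # caller could retry on a fresh connection

print(asyncio.run(embed_with_deadline(["hello"])))  # prints: None
```

Without the deadline, the same await would block until my watchdog restarts the container.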
GiteaMirror added the bug label 2026-05-05 19:29:18 -05:00
@tjbck commented on GitHub (Jul 31, 2025):

#15023

<!-- gh-comment-id:3139591437 -->
@BAngelis commented on GitHub (Aug 1, 2025):

@tjbck thanks for looking at my issue. I added an `AIOHTTP_CLIENT_TIMEOUT=30` to the environment (for a fail-fast), but there is no change. Open WebUI doesn't log anything new (after the open_webui.retrieval.utils:generate_openai_batch_embeddings call) for 10+ minutes. It hangs up all other connected clients; nobody can log out or in. Then suddenly, after 17 minutes in this case, it starts responding to new requests.
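A caveat I'm assuming rather than have verified against the Open WebUI source: `AIOHTTP_CLIENT_TIMEOUT` can only take effect if the hung call actually goes through aiohttp. A synchronous `requests.post` issued without an explicit `timeout=` blocks indefinitely when the peer silently drops the connection. A small stdlib sketch of the difference a socket-level timeout makes (the `socketpair` is a stand-in for the dropped connection):

```python
import socket

def recv_with_timeout(timeout_s=0.1):
    # One end never sends, modelling a connection the peer silently
    # dropped; without a timeout, recv() would block forever.
    a, b = socket.socketpair()
    a.settimeout(timeout_s)  # fail fast instead of hanging
    try:
        a.recv(1024)
        return False  # unreachable: nothing is ever sent
    except socket.timeout:  # alias of TimeoutError since Python 3.10
        return True
    finally:
        a.close()
        b.close()

print(recv_with_timeout())  # prints: True
```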

I've been struggling with this issue for weeks. Where else can I get some help?!

Here are the logs after the `AIOHTTP_CLIENT_TIMEOUT` change:

<snip>
, 'actions': [], 'filters': [], 'tags': []}, 'direct': False}} - {}
ai4l-openwebui     | 2025-08-01 19:51:45.310 | DEBUG    | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 1 - {}
websocket-monitor  | [2025-08-01 19:51:41] MEM: 387.8MiB / 31.35GiB | CPU: 0.31% | CONN: 2 | WS: 1 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:51:50] MEM: 393.4MiB / 31.35GiB | CPU: 0.20% | CONN: 3 | WS: 2 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:51:59] MEM: 393.4MiB / 31.35GiB | CPU: 0.17% | CONN: 3 | WS: 2 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:52:08] MEM: 396.1MiB / 31.35GiB | CPU: 0.18% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:52:12 +0000] "GET /_app/version.json HTTP/2.0" 404 555 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:52:17] MEM: 396.1MiB / 31.35GiB | CPU: 0.20% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:52:26] MEM: 396.1MiB / 31.35GiB | CPU: 0.18% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:52:35] MEM: 396.1MiB / 31.35GiB | CPU: 0.21% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 2025/08/01 19:52:45 [error] 30#30: *1 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "POST /api/chat/completions HTTP/2.0", upstream: "http://172.19.0.2:8080/api/chat/completions", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:52:45 +0000] "POST /api/chat/completions HTTP/2.0" 504 569 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:52:44] MEM: 395.9MiB / 31.35GiB | CPU: 0.17% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 2025/08/01 19:52:51 [info] 30#30: *504 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:52:51 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:52:53] MEM: 395.9MiB / 31.35GiB | CPU: 0.21% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:53:02] MEM: 395.9MiB / 31.35GiB | CPU: 0.19% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:12 +0000] "GET /_app/version.json HTTP/2.0" 404 555 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:14 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:53:14 [info] 30#30: *507 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
websocket-monitor  | [2025-08-01 19:53:11] MEM: 398.7MiB / 31.35GiB | CPU: 0.20% | CONN: 8 | WS: 7 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:53:20] MEM: 398.7MiB / 31.35GiB | CPU: 0.19% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:30 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 101 839595 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:53:30 [info] 30#30: *8 upstream timed out (110: Connection timed out) while proxying upgraded connection, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
websocket-monitor  | [2025-08-01 19:53:29] MEM: 398.7MiB / 31.35GiB | CPU: 0.19% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:37 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:53:37 [info] 30#30: *510 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
websocket-monitor  | [2025-08-01 19:53:38] MEM: 398.5MiB / 31.35GiB | CPU: 0.16% | CONN: 8 | WS: 7 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 2025/08/01 19:53:45 [error] 30#30: *1 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /api/v1/chats/?page=1 HTTP/2.0", upstream: "http://172.19.0.2:8080/api/v1/chats/?page=1", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:45 +0000] "GET /api/v1/chats/?page=1 HTTP/2.0" 504 569 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:53:47] MEM: 398.5MiB / 31.35GiB | CPU: 0.18% | CONN: 8 | WS: 7 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 2025/08/01 19:53:58 [info] 30#30: *1 client canceled stream 721 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /c/0ede08cd-ccd2-4992-b388-c24573c47f18 HTTP/2.0", upstream: "http://172.19.0.2:8080/c/0ede08cd-ccd2-4992-b388-c24573c47f18", host: "ai4l.pmoboost.ai"
openwebui-nginx    | 2025/08/01 19:53:58 [info] 30#30: *1 client canceled stream 717 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /api/models? HTTP/2.0", upstream: "http://172.19.0.2:8080/api/models?", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:58 +0000] "GET /c/0ede08cd-ccd2-4992-b388-c24573c47f18 HTTP/2.0" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:58 +0000] "GET /api/models? HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:53:56] MEM: 398.5MiB / 31.35GiB | CPU: 0.21% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:54:02 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:54:02 [info] 30#30: *513 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
websocket-monitor  | [2025-08-01 19:54:05] MEM: 398.5MiB / 31.35GiB | CPU: 10.10% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:54:10] WARNING: 9 CLOSE_WAIT connections found!
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:54:12 +0000] "GET /_app/version.json HTTP/2.0" 404 555 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:54:15] MEM: 401.2MiB / 31.35GiB | CPU: 0.19% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:54:27 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:54:27 [info] 30#30: *516 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
websocket-monitor  | [2025-08-01 19:54:24] MEM: 401.2MiB / 31.35GiB | CPU: 0.17% | CONN: 8 | WS: 7 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:54:33] MEM: 401.2MiB / 31.35GiB | CPU: 0.23% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:54:39 +0000] "GET /c/0ede08cd-ccd2-4992-b388-c24573c47f18 HTTP/2.0" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:54:39 [info] 30#30: *1 client canceled stream 723 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /c/0ede08cd-ccd2-4992-b388-c24573c47f18 HTTP/2.0", upstream: "http://172.19.0.2:8080/c/0ede08cd-ccd2-4992-b388-c24573c47f18", host: "ai4l.pmoboost.ai"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:54:39 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:54:39 [info] 30#30: *518 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
websocket-monitor  | [2025-08-01 19:54:42] MEM: 401MiB / 31.35GiB | CPU: 0.19% | CONN: 10 | WS: 9 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:54:51] MEM: 401MiB / 31.35GiB | CPU: 0.19% | CONN: 10 | WS: 9 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:55:00] MEM: 401MiB / 31.35GiB | CPU: 0.18% | CONN: 10 | WS: 9 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:55:10 +0000] "GET /static/custom.css HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:55:10 +0000] "GET /static/splash.png HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:55:10 +0000] "GET /static/loader.js HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:55:10 [info] 30#30: *1 client canceled stream 729 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /static/custom.css HTTP/2.0", upstream: "http://172.19.0.2:8080/static/custom.css", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/"
openwebui-nginx    | 2025/08/01 19:55:10 [info] 30#30: *1 client canceled stream 731 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /static/splash.png HTTP/2.0", upstream: "http://172.19.0.2:8080/static/splash.png", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/"
openwebui-nginx    | 2025/08/01 19:55:10 [info] 30#30: *1 client canceled stream 727 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /static/loader.js HTTP/2.0", upstream: "http://172.19.0.2:8080/static/loader.js", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/"
websocket-monitor  | [2025-08-01 19:55:09] MEM: 403.8MiB / 31.35GiB | CPU: 0.21% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:55:18] MEM: 403.7MiB / 31.35GiB | CPU: 0.20% | CONN: 12 | WS: 11 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:55:27 +0000] "GET /static/custom.css HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:55:27 [info] 30#30: *1 client canceled stream 735 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /static/custom.css HTTP/2.0", upstream: "http://172.19.0.2:8080/static/custom.css", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:55:27 +0000] "GET /static/loader.js HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:55:27 +0000] "GET /static/splash.png HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:55:27 [info] 30#30: *1 client canceled stream 733 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /static/loader.js HTTP/2.0", upstream: "http://172.19.0.2:8080/static/loader.js", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/"
openwebui-nginx    | 2025/08/01 19:55:27 [info] 30#30: *1 client canceled stream 737 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /static/splash.png HTTP/2.0", upstream: "http://172.19.0.2:8080/static/splash.png", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/"
websocket-monitor  | [2025-08-01 19:55:27] MEM: 403.7MiB / 31.35GiB | CPU: 0.19% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:55:36] MEM: 403.7MiB / 31.35GiB | CPU: 0.21% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:55:45] MEM: 403.6MiB / 31.35GiB | CPU: 0.20% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:55:54] MEM: 403.6MiB / 31.35GiB | CPU: 0.19% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:56:03] MEM: 403.6MiB / 31.35GiB | CPU: 0.18% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:56:12] MEM: 406.3MiB / 31.35GiB | CPU: 0.20% | CONN: 11 | WS: 10 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:56:21] MEM: 406.3MiB / 31.35GiB | CPU: 0.21% | CONN: 11 | WS: 10 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:56:30] MEM: 406.3MiB / 31.35GiB | CPU: 0.19% | CONN: 11 | WS: 10 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:56:39] MEM: 406.1MiB / 31.35GiB | CPU: 0.21% | CONN: 11 | WS: 10 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:56:48] MEM: 406.1MiB / 31.35GiB | CPU: 0.18% | CONN: 11 | WS: 10 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:56:57] MEM: 406.1MiB / 31.35GiB | CPU: 0.18% | CONN: 11 | WS: 10 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:57:06] MEM: 406.1MiB / 31.35GiB | CPU: 0.20% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:57:15] MEM: 408.8MiB / 31.35GiB | CPU: 0.17% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:57:24] MEM: 408.8MiB / 31.35GiB | CPU: 0.19% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:57:33] MEM: 408.8MiB / 31.35GiB | CPU: 0.20% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:57:42] MEM: 408.6MiB / 31.35GiB | CPU: 0.18% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:57:51] MEM: 408.6MiB / 31.35GiB | CPU: 0.19% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:58:00] MEM: 408.6MiB / 31.35GiB | CPU: 0.19% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:58:09] MEM: 411.4MiB / 31.35GiB | CPU: 0.18% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:58:18] MEM: 411.4MiB / 31.35GiB | CPU: 0.19% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:58:27] MEM: 411.4MiB / 31.35GiB | CPU: 0.20% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 2025/08/01 19:58:35 [info] 30#30: *526 SSL_do_handshake() failed (SSL: error:0A000438:SSL routines::tlsv1 alert internal error:SSL alert number 80) while SSL handshaking, client: 57.141.0.29, server: 0.0.0.0:443
websocket-monitor  | [2025-08-01 19:58:36] MEM: 411.4MiB / 31.35GiB | CPU: 0.21% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:58:45] MEM: 411.2MiB / 31.35GiB | CPU: 0.20% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:58:54] MEM: 411.2MiB / 31.35GiB | CPU: 0.20% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:59:03] MEM: 411.2MiB / 31.35GiB | CPU: 0.20% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:59:13] MEM: 413.9MiB / 31.35GiB | CPU: 0.22% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:59:22] MEM: 413.9MiB / 31.35GiB | CPU: 0.20% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:59:31] MEM: 413.9MiB / 31.35GiB | CPU: 0.21% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:59:40] MEM: 413.7MiB / 31.35GiB | CPU: 0.19% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:59:49] MEM: 413.7MiB / 31.35GiB | CPU: 0.19% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:59:58] MEM: 413.7MiB / 31.35GiB | CPU: 0.18% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:00:07] MEM: 413.7MiB / 31.35GiB | CPU: 0.23% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:00:16] MEM: 416.5MiB / 31.35GiB | CPU: 0.21% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:00:25] MEM: 416.5MiB / 31.35GiB | CPU: 0.20% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:00:34] MEM: 416.5MiB / 31.35GiB | CPU: 0.21% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:00:43] MEM: 416.3MiB / 31.35GiB | CPU: 0.20% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:00:52] MEM: 416.3MiB / 31.35GiB | CPU: 0.20% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:01:01] MEM: 416.3MiB / 31.35GiB | CPU: 0.19% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:01:10] MEM: 419MiB / 31.35GiB | CPU: 0.19% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:01:19] MEM: 419MiB / 31.35GiB | CPU: 0.20% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:01:28] MEM: 419MiB / 31.35GiB | CPU: 0.19% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:01:37] MEM: 419MiB / 31.35GiB | CPU: 0.21% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:01:46] MEM: 418.8MiB / 31.35GiB | CPU: 0.21% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:01:55] MEM: 418.8MiB / 31.35GiB | CPU: 0.21% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:02:04] MEM: 418.8MiB / 31.35GiB | CPU: 0.20% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 43.166.237.57 - - [01/Aug/2025:20:02:13 +0000] "GET / HTTP/1.1" 301 169 "-" "Mozilla/5.0 (iPhone; CPU iPhone OS 13_2_3 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.0.3 Mobile/15E148 Safari/604.1" "-"
openwebui-nginx    | 43.166.237.57 - - [01/Aug/2025:20:02:15 +0000] "GET / HTTP/1.1" 444 0 "http://137.117.14.214" "Mozilla/5.0 (iPhone; CPU iPhone OS 13_2_3 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.0.3 Mobile/15E148 Safari/604.1" "-"
websocket-monitor  | [2025-08-01 20:02:13] MEM: 421.6MiB / 31.35GiB | CPU: 0.21% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:02:22] MEM: 421.5MiB / 31.35GiB | CPU: 0.21% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:02:31] MEM: 421.5MiB / 31.35GiB | CPU: 0.21% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:02:40] MEM: 421.4MiB / 31.35GiB | CPU: 0.19% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:02:49] MEM: 421.4MiB / 31.35GiB | CPU: 0.22% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:02:58] MEM: 421.4MiB / 31.35GiB | CPU: 0.19% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:03:07] MEM: 421.4MiB / 31.35GiB | CPU: 0.22% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:03:16] MEM: 424.1MiB / 31.35GiB | CPU: 0.23% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:03:20] WARNING: 18 CLOSE_WAIT connections found!
websocket-monitor  | [2025-08-01 20:03:26] MEM: 424.1MiB / 31.35GiB | CPU: 0.21% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:03:35] MEM: 424.1MiB / 31.35GiB | CPU: 0.21% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:03:45] MEM: 423.9MiB / 31.35GiB | CPU: 0.22% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:03:54] MEM: 423.9MiB / 31.35GiB | CPU: 0.23% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:04:04] MEM: 423.9MiB / 31.35GiB | CPU: 0.21% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:04:13] MEM: 426.6MiB / 31.35GiB | CPU: 0.20% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:04:22] MEM: 426.6MiB / 31.35GiB | CPU: 0.25% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:04:31] MEM: 426.6MiB / 31.35GiB | CPU: 0.22% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:04:40] MEM: 426.4MiB / 31.35GiB | CPU: 0.22% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:04:49] MEM: 426.4MiB / 31.35GiB | CPU: 0.23% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:04:58] MEM: 426.4MiB / 31.35GiB | CPU: 0.22% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:05:07] MEM: 426.4MiB / 31.35GiB | CPU: 0.24% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:05:16] MEM: 429.2MiB / 31.35GiB | CPU: 0.22% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:05:25] MEM: 429.2MiB / 31.35GiB | CPU: 0.21% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:05:34] MEM: 429.2MiB / 31.35GiB | CPU: 0.22% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:05:43] MEM: 429MiB / 31.35GiB | CPU: 0.21% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:05:52] MEM: 429MiB / 31.35GiB | CPU: 0.24% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:06:01] MEM: 429MiB / 31.35GiB | CPU: 0.22% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:06:10] MEM: 432.1MiB / 31.35GiB | CPU: 0.22% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:06:19] MEM: 431.7MiB / 31.35GiB | CPU: 0.23% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:06:28] MEM: 431.7MiB / 31.35GiB | CPU: 0.23% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:06:37] MEM: 431.7MiB / 31.35GiB | CPU: 0.24% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:06:48] MEM: 431.5MiB / 31.35GiB | CPU: 0.20% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:06:57] MEM: 431.5MiB / 31.35GiB | CPU: 0.23% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:07:06] MEM: 431.5MiB / 31.35GiB | CPU: 0.23% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:07:15] MEM: 434.3MiB / 31.35GiB | CPU: 0.22% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:07:24] MEM: 434.3MiB / 31.35GiB | CPU: 0.25% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:07:33] MEM: 434.3MiB / 31.35GiB | CPU: 0.23% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:07:42] MEM: 434.1MiB / 31.35GiB | CPU: 0.23% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:07:51] MEM: 434.1MiB / 31.35GiB | CPU: 0.24% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:08:00] MEM: 434.1MiB / 31.35GiB | CPU: 0.24% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:08:09] MEM: 434.1MiB / 31.35GiB | CPU: 0.24% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:08:18] MEM: 436.8MiB / 31.35GiB | CPU: 0.24% | CONN: 35 | WS: 34 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 20:08:27] MEM: 436.9MiB / 31.35GiB | CPU: 0.25% | CONN: 35 | WS: 34 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
ai4l-openwebui     | 2025-08-01 20:08:37.954 | DEBUG    | open_webui.utils.middleware:process_chat_payload:902 - tool_ids=None - {}
ai4l-openwebui     | 2025-08-01 20:08:37.955 | DEBUG    | open_webui.utils.middleware:process_chat_payload:903 - tool_servers=[] - {}
ai4l-openwebui     | 2025-08-01 20:08:37.955 | DEBUG    | open_webui.routers.tasks:generate_queries:518 - generating retrieval queries using model 4l---customer-intellegence for user bruce.angelis@boostgroup.llc - {}
ai4l-openwebui     | 2025-08-01 20:08:37.956 | DEBUG    | open_webui.utils.chat:generate_chat_completion:167 - generate_chat_completion: {'model': '4l---customer-intellegence', 'messages': [{'role': 'user', 'content': '### Task:\nAnalyze the chat history to determine the necessity of generating search queries, in the given language. By default, **prioritize generating 1-3 broad and relevant search queries** unless it is absolutely certain that no additional information is r
<snip>
<!-- gh-comment-id:3145840532 --> @BAngelis commented on GitHub (Aug 1, 2025): @tjbck thanks for looking at my issue. I added an AIOHTTP_CLIENT_TIMEOUT=30 to the environment (to fail fast), but there is no change. Open WebUI doesn't log anything new (after the open_webui.retrieval.utils:generate_openai_batch_embeddings call) for 10+ minutes. It hangs up all other connected clients; nobody can log out or in. Then suddenly, after 17 minutes in this case, it starts responding to new requests. I've been struggling with this issue for weeks. Where else can I get some help?! Here are the logs after the AIOHTTP_CLIENT_TIMEOUT change:

```
<snip> , 'actions': [], 'filters': [], 'tags': []}, 'direct': False}} - {}
ai4l-openwebui     | 2025-08-01 19:51:45.310 | DEBUG    | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 1 - {}
websocket-monitor  | [2025-08-01 19:51:41] MEM: 387.8MiB / 31.35GiB | CPU: 0.31% | CONN: 2 | WS: 1 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:51:50] MEM: 393.4MiB / 31.35GiB | CPU: 0.20% | CONN: 3 | WS: 2 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:51:59] MEM: 393.4MiB / 31.35GiB | CPU: 0.17% | CONN: 3 | WS: 2 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:52:08] MEM: 396.1MiB / 31.35GiB | CPU: 0.18% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:52:12 +0000] "GET /_app/version.json HTTP/2.0" 404 555 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:52:17] MEM: 396.1MiB / 31.35GiB | CPU: 0.20% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:52:26] MEM: 396.1MiB / 31.35GiB | CPU: 0.18% | CONN: 5 | WS: 4 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:52:35] MEM: 396.1MiB / 31.35GiB | CPU: 0.21% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 2025/08/01 19:52:45 [error] 30#30: *1 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "POST /api/chat/completions HTTP/2.0", upstream: "http://172.19.0.2:8080/api/chat/completions", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:52:45 +0000] "POST /api/chat/completions HTTP/2.0" 504 569 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:52:44] MEM: 395.9MiB / 31.35GiB | CPU: 0.17% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 2025/08/01 19:52:51 [info] 30#30: *504 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:52:51 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:52:53] MEM: 395.9MiB / 31.35GiB | CPU: 0.21% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:53:02] MEM: 395.9MiB / 31.35GiB | CPU: 0.19% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:12 +0000] "GET /_app/version.json HTTP/2.0" 404 555 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:14 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:53:14 [info] 30#30: *507 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
websocket-monitor  | [2025-08-01 19:53:11] MEM: 398.7MiB / 31.35GiB | CPU: 0.20% | CONN: 8 | WS: 7 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:53:20] MEM: 398.7MiB / 31.35GiB | CPU: 0.19% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:30 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 101 839595 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:53:30 [info] 30#30: *8 upstream timed out (110: Connection timed out) while proxying upgraded connection, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
websocket-monitor  | [2025-08-01 19:53:29] MEM: 398.7MiB / 31.35GiB | CPU: 0.19% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:37 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:53:37 [info] 30#30: *510 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
websocket-monitor  | [2025-08-01 19:53:38] MEM: 398.5MiB / 31.35GiB | CPU: 0.16% | CONN: 8 | WS: 7 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 2025/08/01 19:53:45 [error] 30#30: *1 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /api/v1/chats/?page=1 HTTP/2.0", upstream: "http://172.19.0.2:8080/api/v1/chats/?page=1", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:45 +0000] "GET /api/v1/chats/?page=1 HTTP/2.0" 504 569 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:53:47] MEM: 398.5MiB / 31.35GiB | CPU: 0.18% | CONN: 8 | WS: 7 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 2025/08/01 19:53:58 [info] 30#30: *1 client canceled stream 721 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /c/0ede08cd-ccd2-4992-b388-c24573c47f18 HTTP/2.0", upstream: "http://172.19.0.2:8080/c/0ede08cd-ccd2-4992-b388-c24573c47f18", host: "ai4l.pmoboost.ai"
openwebui-nginx    | 2025/08/01 19:53:58 [info] 30#30: *1 client canceled stream 717 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /api/models? HTTP/2.0", upstream: "http://172.19.0.2:8080/api/models?", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:58 +0000] "GET /c/0ede08cd-ccd2-4992-b388-c24573c47f18 HTTP/2.0" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:53:58 +0000] "GET /api/models? HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:53:56] MEM: 398.5MiB / 31.35GiB | CPU: 0.21% | CONN: 6 | WS: 5 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:54:02 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:54:02 [info] 30#30: *513 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
websocket-monitor  | [2025-08-01 19:54:05] MEM: 398.5MiB / 31.35GiB | CPU: 10.10% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:54:10] WARNING: 9 CLOSE_WAIT connections found!
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:54:12 +0000] "GET /_app/version.json HTTP/2.0" 404 555 "https://ai4l.pmoboost.ai/c/0ede08cd-ccd2-4992-b388-c24573c47f18" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
websocket-monitor  | [2025-08-01 19:54:15] MEM: 401.2MiB / 31.35GiB | CPU: 0.19% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:54:27 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:54:27 [info] 30#30: *516 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai"
websocket-monitor  | [2025-08-01 19:54:24] MEM: 401.2MiB / 31.35GiB | CPU: 0.17% | CONN: 8 | WS: 7 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
websocket-monitor  | [2025-08-01 19:54:33] MEM: 401.2MiB / 31.35GiB | CPU: 0.23% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0
websocket-monitor  | 0
openwebui-nginx    | 98.97.38.1 - - [01/Aug/2025:19:54:39 +0000] "GET /c/0ede08cd-ccd2-4992-b388-c24573c47f18 HTTP/2.0" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-"
openwebui-nginx    | 2025/08/01 19:54:39 [info] 30#30: *1 client canceled stream 723 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /c/0ede08cd-ccd2-4992-b388-c24573c47f18 HTTP/2.0", upstream: "http://172.19.0.2:8080/c/0ede08cd-ccd2-4992-b388-c24573c47f18", host: "ai4l.pmoboost.ai"
openwebui-nginx | 98.97.38.1 - - [01/Aug/2025:19:54:39 +0000] "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-" openwebui-nginx | 2025/08/01 19:54:39 [info] 30#30: *518 epoll_wait() reported that client prematurely closed connection, so upstream connection is closed too while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /ws/socket.io/?EIO=4&transport=websocket HTTP/1.1", upstream: "http://172.19.0.2:8080/ws/socket.io/?EIO=4&transport=websocket", host: "ai4l.pmoboost.ai" websocket-monitor | [2025-08-01 19:54:42] MEM: 401MiB / 31.35GiB | CPU: 0.19% | CONN: 10 | WS: 9 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:54:51] MEM: 401MiB / 31.35GiB | CPU: 0.19% | CONN: 10 | WS: 9 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:55:00] MEM: 401MiB / 31.35GiB | CPU: 0.18% | CONN: 10 | WS: 9 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 openwebui-nginx | 98.97.38.1 - - [01/Aug/2025:19:55:10 +0000] "GET /static/custom.css HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-" openwebui-nginx | 98.97.38.1 - - [01/Aug/2025:19:55:10 +0000] "GET /static/splash.png HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-" openwebui-nginx | 98.97.38.1 - - [01/Aug/2025:19:55:10 +0000] "GET /static/loader.js HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-" openwebui-nginx | 2025/08/01 19:55:10 [info] 30#30: *1 client canceled stream 729 while sending request to upstream, client: 98.97.38.1, 
server: ai4l.pmoboost.ai, request: "GET /static/custom.css HTTP/2.0", upstream: "http://172.19.0.2:8080/static/custom.css", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/" openwebui-nginx | 2025/08/01 19:55:10 [info] 30#30: *1 client canceled stream 731 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /static/splash.png HTTP/2.0", upstream: "http://172.19.0.2:8080/static/splash.png", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/" openwebui-nginx | 2025/08/01 19:55:10 [info] 30#30: *1 client canceled stream 727 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /static/loader.js HTTP/2.0", upstream: "http://172.19.0.2:8080/static/loader.js", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/" websocket-monitor | [2025-08-01 19:55:09] MEM: 403.8MiB / 31.35GiB | CPU: 0.21% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:55:18] MEM: 403.7MiB / 31.35GiB | CPU: 0.20% | CONN: 12 | WS: 11 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 openwebui-nginx | 98.97.38.1 - - [01/Aug/2025:19:55:27 +0000] "GET /static/custom.css HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-" openwebui-nginx | 2025/08/01 19:55:27 [info] 30#30: *1 client canceled stream 735 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /static/custom.css HTTP/2.0", upstream: "http://172.19.0.2:8080/static/custom.css", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/" openwebui-nginx | 98.97.38.1 - - [01/Aug/2025:19:55:27 +0000] "GET /static/loader.js HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-" openwebui-nginx | 98.97.38.1 - - 
[01/Aug/2025:19:55:27 +0000] "GET /static/splash.png HTTP/2.0" 499 0 "https://ai4l.pmoboost.ai/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36" "-" openwebui-nginx | 2025/08/01 19:55:27 [info] 30#30: *1 client canceled stream 733 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /static/loader.js HTTP/2.0", upstream: "http://172.19.0.2:8080/static/loader.js", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/" openwebui-nginx | 2025/08/01 19:55:27 [info] 30#30: *1 client canceled stream 737 while sending request to upstream, client: 98.97.38.1, server: ai4l.pmoboost.ai, request: "GET /static/splash.png HTTP/2.0", upstream: "http://172.19.0.2:8080/static/splash.png", host: "ai4l.pmoboost.ai", referrer: "https://ai4l.pmoboost.ai/" websocket-monitor | [2025-08-01 19:55:27] MEM: 403.7MiB / 31.35GiB | CPU: 0.19% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:55:36] MEM: 403.7MiB / 31.35GiB | CPU: 0.21% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:55:45] MEM: 403.6MiB / 31.35GiB | CPU: 0.20% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:55:54] MEM: 403.6MiB / 31.35GiB | CPU: 0.19% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:56:03] MEM: 403.6MiB / 31.35GiB | CPU: 0.18% | CONN: 9 | WS: 8 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:56:12] MEM: 406.3MiB / 31.35GiB | CPU: 0.20% | CONN: 11 | WS: 10 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:56:21] MEM: 406.3MiB / 31.35GiB | CPU: 0.21% | CONN: 11 | WS: 10 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:56:30] MEM: 406.3MiB / 31.35GiB | CPU: 0.19% | CONN: 11 
| WS: 10 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:56:39] MEM: 406.1MiB / 31.35GiB | CPU: 0.21% | CONN: 11 | WS: 10 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:56:48] MEM: 406.1MiB / 31.35GiB | CPU: 0.18% | CONN: 11 | WS: 10 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:56:57] MEM: 406.1MiB / 31.35GiB | CPU: 0.18% | CONN: 11 | WS: 10 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:57:06] MEM: 406.1MiB / 31.35GiB | CPU: 0.20% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:57:15] MEM: 408.8MiB / 31.35GiB | CPU: 0.17% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:57:24] MEM: 408.8MiB / 31.35GiB | CPU: 0.19% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:57:33] MEM: 408.8MiB / 31.35GiB | CPU: 0.20% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:57:42] MEM: 408.6MiB / 31.35GiB | CPU: 0.18% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:57:51] MEM: 408.6MiB / 31.35GiB | CPU: 0.19% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:58:00] MEM: 408.6MiB / 31.35GiB | CPU: 0.19% | CONN: 13 | WS: 12 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:58:09] MEM: 411.4MiB / 31.35GiB | CPU: 0.18% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:58:18] MEM: 411.4MiB / 31.35GiB | CPU: 0.19% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:58:27] MEM: 411.4MiB / 31.35GiB | CPU: 0.20% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0 
websocket-monitor | 0 openwebui-nginx | 2025/08/01 19:58:35 [info] 30#30: *526 SSL_do_handshake() failed (SSL: error:0A000438:SSL routines::tlsv1 alert internal error:SSL alert number 80) while SSL handshaking, client: 57.141.0.29, server: 0.0.0.0:443 websocket-monitor | [2025-08-01 19:58:36] MEM: 411.4MiB / 31.35GiB | CPU: 0.21% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:58:45] MEM: 411.2MiB / 31.35GiB | CPU: 0.20% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:58:54] MEM: 411.2MiB / 31.35GiB | CPU: 0.20% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:59:03] MEM: 411.2MiB / 31.35GiB | CPU: 0.20% | CONN: 15 | WS: 14 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:59:13] MEM: 413.9MiB / 31.35GiB | CPU: 0.22% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:59:22] MEM: 413.9MiB / 31.35GiB | CPU: 0.20% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:59:31] MEM: 413.9MiB / 31.35GiB | CPU: 0.21% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:59:40] MEM: 413.7MiB / 31.35GiB | CPU: 0.19% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:59:49] MEM: 413.7MiB / 31.35GiB | CPU: 0.19% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 19:59:58] MEM: 413.7MiB / 31.35GiB | CPU: 0.18% | CONN: 17 | WS: 16 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:00:07] MEM: 413.7MiB / 31.35GiB | CPU: 0.23% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:00:16] MEM: 416.5MiB / 31.35GiB | CPU: 0.21% | CONN: 19 | WS: 
18 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:00:25] MEM: 416.5MiB / 31.35GiB | CPU: 0.20% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:00:34] MEM: 416.5MiB / 31.35GiB | CPU: 0.21% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:00:43] MEM: 416.3MiB / 31.35GiB | CPU: 0.20% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:00:52] MEM: 416.3MiB / 31.35GiB | CPU: 0.20% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:01:01] MEM: 416.3MiB / 31.35GiB | CPU: 0.19% | CONN: 19 | WS: 18 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:01:10] MEM: 419MiB / 31.35GiB | CPU: 0.19% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:01:19] MEM: 419MiB / 31.35GiB | CPU: 0.20% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:01:28] MEM: 419MiB / 31.35GiB | CPU: 0.19% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:01:37] MEM: 419MiB / 31.35GiB | CPU: 0.21% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:01:46] MEM: 418.8MiB / 31.35GiB | CPU: 0.21% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:01:55] MEM: 418.8MiB / 31.35GiB | CPU: 0.21% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:02:04] MEM: 418.8MiB / 31.35GiB | CPU: 0.20% | CONN: 21 | WS: 20 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 openwebui-nginx | 43.166.237.57 - - [01/Aug/2025:20:02:13 +0000] "GET / HTTP/1.1" 301 169 "-" "Mozilla/5.0 (iPhone; CPU iPhone OS 13_2_3 like Mac OS X) 
AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.0.3 Mobile/15E148 Safari/604.1" "-" openwebui-nginx | 43.166.237.57 - - [01/Aug/2025:20:02:15 +0000] "GET / HTTP/1.1" 444 0 "http://137.117.14.214" "Mozilla/5.0 (iPhone; CPU iPhone OS 13_2_3 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.0.3 Mobile/15E148 Safari/604.1" "-" websocket-monitor | [2025-08-01 20:02:13] MEM: 421.6MiB / 31.35GiB | CPU: 0.21% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:02:22] MEM: 421.5MiB / 31.35GiB | CPU: 0.21% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:02:31] MEM: 421.5MiB / 31.35GiB | CPU: 0.21% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:02:40] MEM: 421.4MiB / 31.35GiB | CPU: 0.19% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:02:49] MEM: 421.4MiB / 31.35GiB | CPU: 0.22% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:02:58] MEM: 421.4MiB / 31.35GiB | CPU: 0.19% | CONN: 23 | WS: 22 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:03:07] MEM: 421.4MiB / 31.35GiB | CPU: 0.22% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:03:16] MEM: 424.1MiB / 31.35GiB | CPU: 0.23% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:03:20] WARNING: 18 CLOSE_WAIT connections found! 
websocket-monitor | [2025-08-01 20:03:26] MEM: 424.1MiB / 31.35GiB | CPU: 0.21% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:03:35] MEM: 424.1MiB / 31.35GiB | CPU: 0.21% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:03:45] MEM: 423.9MiB / 31.35GiB | CPU: 0.22% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:03:54] MEM: 423.9MiB / 31.35GiB | CPU: 0.23% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:04:04] MEM: 423.9MiB / 31.35GiB | CPU: 0.21% | CONN: 25 | WS: 24 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:04:13] MEM: 426.6MiB / 31.35GiB | CPU: 0.20% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:04:22] MEM: 426.6MiB / 31.35GiB | CPU: 0.25% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:04:31] MEM: 426.6MiB / 31.35GiB | CPU: 0.22% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:04:40] MEM: 426.4MiB / 31.35GiB | CPU: 0.22% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:04:49] MEM: 426.4MiB / 31.35GiB | CPU: 0.23% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:04:58] MEM: 426.4MiB / 31.35GiB | CPU: 0.22% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:05:07] MEM: 426.4MiB / 31.35GiB | CPU: 0.24% | CONN: 27 | WS: 26 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:05:16] MEM: 429.2MiB / 31.35GiB | CPU: 0.22% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:05:25] MEM: 
429.2MiB / 31.35GiB | CPU: 0.21% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:05:34] MEM: 429.2MiB / 31.35GiB | CPU: 0.22% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:05:43] MEM: 429MiB / 31.35GiB | CPU: 0.21% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:05:52] MEM: 429MiB / 31.35GiB | CPU: 0.24% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:06:01] MEM: 429MiB / 31.35GiB | CPU: 0.22% | CONN: 29 | WS: 28 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:06:10] MEM: 432.1MiB / 31.35GiB | CPU: 0.22% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:06:19] MEM: 431.7MiB / 31.35GiB | CPU: 0.23% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:06:28] MEM: 431.7MiB / 31.35GiB | CPU: 0.23% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:06:37] MEM: 431.7MiB / 31.35GiB | CPU: 0.24% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:06:48] MEM: 431.5MiB / 31.35GiB | CPU: 0.20% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:06:57] MEM: 431.5MiB / 31.35GiB | CPU: 0.23% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:07:06] MEM: 431.5MiB / 31.35GiB | CPU: 0.23% | CONN: 31 | WS: 30 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:07:15] MEM: 434.3MiB / 31.35GiB | CPU: 0.22% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:07:24] MEM: 434.3MiB / 31.35GiB | CPU: 0.25% | CONN: 33 | WS: 32 | 
PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:07:33] MEM: 434.3MiB / 31.35GiB | CPU: 0.23% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:07:42] MEM: 434.1MiB / 31.35GiB | CPU: 0.23% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:07:51] MEM: 434.1MiB / 31.35GiB | CPU: 0.24% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:08:00] MEM: 434.1MiB / 31.35GiB | CPU: 0.24% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:08:09] MEM: 434.1MiB / 31.35GiB | CPU: 0.24% | CONN: 33 | WS: 32 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:08:18] MEM: 436.8MiB / 31.35GiB | CPU: 0.24% | CONN: 35 | WS: 34 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 websocket-monitor | [2025-08-01 20:08:27] MEM: 436.9MiB / 31.35GiB | CPU: 0.25% | CONN: 35 | WS: 34 | PROC: 1 | FD: 36 | SIO: 0 websocket-monitor | 0 ai4l-openwebui | 2025-08-01 20:08:37.954 | DEBUG | open_webui.utils.middleware:process_chat_payload:902 - tool_ids=None - {} ai4l-openwebui | 2025-08-01 20:08:37.955 | DEBUG | open_webui.utils.middleware:process_chat_payload:903 - tool_servers=[] - {} ai4l-openwebui | 2025-08-01 20:08:37.955 | DEBUG | open_webui.routers.tasks:generate_queries:518 - generating retrieval queries using model 4l---customer-intellegence for user bruce.angelis@boostgroup.llc - {} ai4l-openwebui | 2025-08-01 20:08:37.956 | DEBUG | open_webui.utils.chat:generate_chat_completion:167 - generate_chat_completion: {'model': '4l---customer-intellegence', 'messages': [{'role': 'user', 'content': '### Task:\nAnalyze the chat history to determine the necessity of generating search queries, in the given language. 
By default, **prioritize generating 1-3 broad and relevant search queries** unless it is absolutely certain that no additional information is r <snip> ```

@BAngelis commented on GitHub (Aug 1, 2025):

All I'm doing is chatting with a model that has a knowledge base in Pinecone. I'm not uploading files or doing anything complicated (yet).

<!-- gh-comment-id:3145888297 -->

@rgaricano commented on GitHub (Aug 1, 2025):

Bruce, could you explain your whole setup and the services you are using?
(Open WebUI install method, Ollama or another LLM provider, web proxy, database engine, embedding engine, ...)

What is your setup for Pinecone (local or remote, Starter or paid plan, ...)?
I've read about some timeout errors similar to yours:
https://community.pinecone.io/t/request-timeout/3750/2
https://community.n8n.io/t/pinecone-vector-store-timeout-with-large-whatsapp-chats-need-optimization-help/148202/4
https://community.pinecone.io/t/who-else-is-facing-504-gateway-timeouts-urgent/4064
...

I would try increasing the nginx timeouts, and since it seems most likely that the timeouts are coming from Pinecone, I would try another database engine to see if the same errors occur. That would help narrow down the problem.
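To narrow down which hop is slow before swapping database engines, it can help to time each remote call separately (the embedding request vs. the vector-store query). A generic sketch; `time_call` is just an illustrative helper, not an Open WebUI API:

```python
import time

def time_call(fn, *args, **kwargs):
    """Run fn and report how long it took.

    Wrap the embedding request and the vector-store query separately
    to see which one is actually eating the time.
    """
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    print(f"{fn.__name__} took {elapsed:.2f}s")
    return result

# Example with a stand-in for a slow remote call:
time_call(time.sleep, 0.05)
```

If the embedding call is the slow one, the problem is upstream of Pinecone and changing the database engine won't help.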

<!-- gh-comment-id:3145931846 -->

@BAngelis commented on GitHub (Aug 8, 2025):

Hello @rgaricano,

Thanks for your response.

My setup here is:

  1. Running Open WebUI in Docker
  2. On an Azure VM
  3. Using nginx as a reverse proxy
  4. Using Pinecone as the vector store, on the paid Standard plan
  5. LLM is via OpenAI, on the paid Pro plan
  6. Embedding model is text-embedding-3-small
  7. Database engine is whatever the default is

It seems to me that it stops performing normally between these two calls:

ai4l-openwebui | 2025-07-29T23:26:56.671475035Z 2025-07-29 23:26:56.671 | DEBUG | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 3 - {}

ai4l-openwebui | 2025-07-29T23:26:56.993248363Z 2025-07-29 23:26:56.992 | DEBUG | open_webui.retrieval.utils:query_collection:299 - query_collection: processing 3 queries across 1 collections - {}

Every time it blocks/hangs, the last DEBUG log entry is always "open_webui.retrieval.utils:generate_openai_batch_embeddings".

So setting AIOHTTP_CLIENT_TIMEOUT seemed to make sense, but it didn't change the behavior.

I'm experimenting with the nginx timeouts now, but I'm not seeing any improvement yet.
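If AIOHTTP_CLIENT_TIMEOUT isn't being honored on this code path, one defensive pattern is to bound the awaited call explicitly with `asyncio.wait_for`, so a silent hang surfaces as a timeout instead of blocking forever. A minimal sketch; `slow_embedding_call` is a placeholder standing in for the real request, not the Open WebUI function:

```python
import asyncio

async def slow_embedding_call():
    # Placeholder for an embeddings request that never returns.
    await asyncio.sleep(300)
    return [[0.0]]

async def embed_with_timeout(timeout_s: float):
    # Bound the wait explicitly; on timeout, fail the request cleanly
    # instead of leaving the chat hanging with no response.
    try:
        return await asyncio.wait_for(slow_embedding_call(), timeout=timeout_s)
    except asyncio.TimeoutError:
        return None

print(asyncio.run(embed_with_timeout(0.1)))  # prints "None"
```

The caller can then retry or return an error to the browser rather than leaving the request open indefinitely.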

<!-- gh-comment-id:3168857803 -->

@rgaricano commented on GitHub (Aug 8, 2025):

You're not using Redis for websockets?
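For reference, Open WebUI can back its Socket.IO layer with Redis via environment variables (variable names per the Open WebUI environment-configuration docs; verify them against your installed version). A docker-compose sketch:

```yaml
services:
  openwebui:
    environment:
      - ENABLE_WEBSOCKET_SUPPORT=true
      - WEBSOCKET_MANAGER=redis
      - WEBSOCKET_REDIS_URL=redis://redis:6379/0
  redis:
    image: redis:7-alpine
    restart: unless-stopped
```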

<!-- gh-comment-id:3168973902 -->

@BAngelis commented on GitHub (Aug 8, 2025):

@rgaricano, no, I'm not using Redis, but I'm happy to take any suggestions, since this is totally blocking our progress.

After tweaking the timeouts, I'm still able to get it to "hang" for 13 minutes between the embedding and query_collection calls. During that time, clients are unable to stop their chats, log out, or do much of anything from the UI.

Here's a screenshot of the logs.

![Image](https://github.com/user-attachments/assets/2bf8fcc3-46cd-4645-831d-c7c4a3f541cf)
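One possible explanation for the *entire* UI freezing during the hang (not just the one chat) is a blocking call running on the async event loop: if the embeddings request is made synchronously inside an async handler, no other request on that worker can be serviced until it returns. This is a hypothesis, not something confirmed from the Open WebUI source; the sketch below only illustrates the failure mode with a stand-in `time.sleep`:

```python
import asyncio
import time

async def hung_embedding_request():
    # Stands in for a synchronous HTTP call made directly inside an
    # async handler: time.sleep never yields, so it blocks the loop.
    time.sleep(0.2)

async def unrelated_request():
    # Any other client action (stop chat, log out, load chat list)
    # queued on the same event loop.
    return "ok"

async def main():
    start = time.perf_counter()
    task = asyncio.create_task(unrelated_request())
    await hung_embedding_request()
    await task
    return time.perf_counter() - start

# The unrelated request cannot complete until the blocking call
# returns, so the whole round trip takes at least the blocked duration.
elapsed = asyncio.run(main())
print(elapsed >= 0.2)  # True: everything waited behind the block
```

That pattern would match the symptom that logging out and stopping chats also stall while the embedding call is outstanding.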
<!-- gh-comment-id:3169139600 -->

@BAngelis commented on GitHub (Aug 8, 2025):

@rgaricano, here is my new nginx.conf with better timeout settings.

user  nginx;
worker_processes auto;

error_log  /var/log/nginx/error.log warn;
pid        /var/run/nginx.pid;

events {
  worker_connections 4096;
  multi_accept on;
}

http {
  include       /etc/nginx/mime.types;
  default_type  application/octet-stream;

  log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                  '$status $body_bytes_sent "$http_referer" '
                  '"$http_user_agent" "$http_x_forwarded_for"';
  access_log /var/log/nginx/access.log main;

  # HTTP tuning
  sendfile on;
  tcp_nopush on;
  tcp_nodelay on;
  keepalive_timeout 65s;
  keepalive_requests 1000;
  server_tokens off;

  client_max_body_size 50m;   # supports your RAG uploads
  client_body_timeout 60s;
  send_timeout 120s;

  # gzip (avoid text/event-stream to keep SSE snappy)
  gzip on;
  gzip_comp_level 5;
  gzip_min_length 1024;
  gzip_types text/plain text/css application/json application/javascript application/xml image/svg+xml;

  # Upstream: use Docker service name on shared network
  upstream openwebui_upstream {
    server openwebui:8080;
    keepalive 32;
  }

  # WebSocket upgrade helper
  map $http_upgrade $connection_upgrade {
    default upgrade;
    ''      close;
  }

  # --- HTTP -> HTTPS redirect ---
  server {
    listen 80 default_server;
    listen [::]:80 default_server;
    server_name _;
    return 301 https://$host$request_uri;
  }

  # --- Primary HTTPS vhost for OpenWebUI ---
  server {
    listen 443 ssl http2;
    listen [::]:443 ssl http2;
    server_name ai4l.pmoboost.ai;

    # Let’s Encrypt certs (mounted from host)
    ssl_protocols TLSv1.2 TLSv1.3;
    ssl_certificate     /etc/letsencrypt/live/ai4l.pmoboost.ai/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/ai4l.pmoboost.ai/privkey.pem;

    # Sane TLS (no legacy cipher pinning)
    ssl_session_cache   shared:SSL:50m;
    ssl_session_timeout 1d;
    ssl_session_tickets off;

    # OCSP stapling (recommended)
    ssl_stapling on;
    ssl_stapling_verify on;
    resolver 1.1.1.1 1.0.0.1 valid=300s ipv6=off;
    resolver_timeout 5s;

    # Security headers
    add_header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" always;
    add_header X-Content-Type-Options "nosniff" always;
    add_header Referrer-Policy "strict-origin-when-cross-origin" always;
    add_header Permissions-Policy "geolocation=(), microphone=(), camera=()" always;
    # Minimal CSP that plays nice with OpenWebUI; tighten later if desired
    add_header Content-Security-Policy "default-src 'self' 'unsafe-inline' data: blob:; connect-src 'self' https: ws: wss:; img-src 'self' data: blob: https:; media-src 'self' data: blob: https:;" always;

    # Proxy defaults (streaming-friendly)
    proxy_http_version 1.1;
    proxy_set_header Host              $host;
    proxy_set_header X-Real-IP         $remote_addr;
    proxy_set_header X-Forwarded-For   $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header Connection        "";

    proxy_connect_timeout 10s;
    proxy_send_timeout    120s;
    proxy_read_timeout    900s;   # don’t cut off long/quiet token streams
    send_timeout          120s;

    # Main app — CRITICAL: disable buffering for SSE/token streaming
    location / {
      proxy_pass http://openwebui_upstream;
      proxy_buffering off;
      proxy_request_buffering off;
    }

    # WebSockets (if OWUI uses them)
    location /ws/ {
      proxy_pass http://openwebui_upstream;
      proxy_http_version 1.1;
      proxy_set_header Upgrade $http_upgrade;
      proxy_set_header Connection $connection_upgrade;
      proxy_read_timeout 900s;
      proxy_buffering off;
    }
  }
}

@rgaricano commented on GitHub (Aug 8, 2025):

If you don't use any websocket manager then you have a bottleneck there. Without Redis, the task system falls back to local in-memory storage, which can cause enqueue failures.


@BAngelis commented on GitHub (Aug 8, 2025):

@rgaricano Good to know, that sounds like what's happening (especially with multiple clients having chat sessions with several requests/responses). I'll dig into the docs on using Redis w/ Open WebUI. Thanks!


@rgaricano commented on GitHub (Aug 8, 2025):

https://docs.openwebui.com/tutorials/integrations/redis


@BAngelis commented on GitHub (Aug 9, 2025):

@rgaricano Okay, I've integrated Redis, but I can still reproduce the hang/stall issue the same way: two clients doing Q&A against models that use Pinecone as their knowledge base.

(screenshot attached)

BUT I did get an additional clue: after 15 minutes the hang/stall is released and more log records are written (and the clients error out simultaneously). I get these log ERRORs:

```
2025-08-08 23:20:52.254 | ERROR | open_webui.socket.main:periodic_usage_pool_cleanup:157 - Unable to renew cleanup lock. Exiting usage pool cleanup. - {}
2025-08-08 23:20:52.254 | DEBUG | open_webui.retrieval.utils:generate_openai_batch_embeddings:736 - generate_openai_batch_embeddings:model text-embedding-3-small batch size: 3 - {}
2025-08-08 23:20:52.255 | ERROR | asyncio.runners:run:118 - Task exception was never retrieved
future: <Task finished name='Task-4' coro=<periodic_usage_pool_cleanup() done, defined at /app/backend/open_webui/socket/main.py:133> exception=Exception('Unable to renew usage pool cleanup lock.')> - {}
Traceback (most recent call last):
  File "/app/backend/open_webui/socket/main.py", line 158, in periodic_usage_pool_cleanup
    raise Exception("Unable to renew usage pool cleanup lock.")
Exception: Unable to renew usage pool cleanup lock.
```
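The "Task exception was never retrieved" entry matches a common lock-renewal pattern: a periodic background task repeatedly renews a shared cleanup lock and raises when renewal fails, and since nothing awaits the task, asyncio only reports the exception later. A minimal sketch of that pattern (the `FakeLock` class and function names here are illustrative, not the actual Open WebUI code):

```python
import asyncio


class FakeLock:
    """Stand-in for a Redis-backed lock whose renew() eventually fails,
    e.g. because another process took the lock or Redis became unreachable."""

    def __init__(self, renewals_before_failure: int):
        self.remaining = renewals_before_failure

    def renew(self) -> bool:
        self.remaining -= 1
        return self.remaining >= 0


async def periodic_cleanup(lock, interval: float = 0.01):
    """Periodically do cleanup work while renewing the lock; bail out as
    soon as renewal fails, mirroring the 'Unable to renew cleanup lock.
    Exiting usage pool cleanup.' log line above."""
    while True:
        await asyncio.sleep(interval)
        if not lock.renew():
            raise Exception("Unable to renew usage pool cleanup lock.")
        # ... cleanup work would happen here ...
```

If this coroutine is launched with `asyncio.create_task(...)` and the resulting task is never awaited, the raised exception surfaces only when the task object is garbage-collected, which is exactly the "Task exception was never retrieved" message in the server log.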

Here are the client-side logs, if needed:

(screenshot attached)

@rgaricano commented on GitHub (Aug 9, 2025):

Yes, it's a connection interruption of both the web and websocket connections.
Did you check the Azure VM Load Balancer? Maybe you have one of these services enabled: https://learn.microsoft.com/en-us/azure/load-balancer/load-balancer-tcp-idle-timeout?source=recommendations&tabs=tcp-reset-idle-portal

(Sorry, but I don't know much about Azure services.)


@BAngelis commented on GitHub (Aug 11, 2025):

@rgaricano Thanks for the feedback. I'm not yet using a load balancer; it's just a single VM instance running the entire configuration in Docker.

I'm going to remove my use of Pinecone as the Knowledge Base vector store. And re-run my tests.


@BAngelis commented on GitHub (Aug 11, 2025):

@rgaricano thanks for your support.

So far I've run the test suite three times using the default built-in ChromaDB support, with no failures. So the hang/stall in Open WebUI looks to be related to my use of Pinecone as the vector store.

According to the Open WebUI documentation, Pinecone is supported, but it doesn't work for me. It worked for basic chat, but not with multiple concurrent users doing typical research scenarios (long contexts, many prompts). It's not a rate-limit problem on the Pinecone side, but I am requesting the activity logs from them to see the timing of the API calls coming from my Open WebUI, and what happens during that 13-to-15-minute stall after calling open_webui.retrieval.utils:generate_openai_batch_embeddings.
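For context on the stall point: the DEBUG line earlier ("batch size: 3") refers to the retrieval layer splitting texts into fixed-size batches before calling the embeddings API, one HTTP request per batch. A rough sketch of that batching step (the helper names are hypothetical, not the actual Open WebUI code); note that if any single batch request has no explicit timeout, the whole loop blocks until OS-level TCP timeouts fire, which is one plausible explanation for a stall lasting many minutes:

```python
from typing import Iterator


def batched(texts: list[str], batch_size: int) -> Iterator[list[str]]:
    """Split texts into consecutive batches of at most batch_size items."""
    for i in range(0, len(texts), batch_size):
        yield texts[i:i + batch_size]


def embed_all(texts: list[str], batch_size: int = 3,
              timeout: float = 30.0) -> list[list[float]]:
    """Embed every text, batch by batch. A bounded per-request `timeout`
    is what keeps one unresponsive embeddings call from stalling the
    entire pipeline."""
    embeddings: list[list[float]] = []
    for batch in batched(texts, batch_size):
        # A real implementation would call the embeddings API here, e.g.
        # embed_batch(batch, timeout=timeout); stubbed with a trivial
        # per-text vector so the sketch stays self-contained.
        embeddings.extend([[float(len(t))] for t in batch])
    return embeddings
```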


@theboyr commented on GitHub (Aug 12, 2025):

I ran into something similar after upgrading this weekend.

"Metadata value must be a string, number, boolean or list of strings" is an error that Pinecone throws that I'm getting a lot of lately. Pinecone does not support complex metadata.. but it appears OUI is now adding nested Metadata into Upsert to Pinecone.

I had planned to switch to S3 Vectors anyways so just a forcing function then i guess.
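Pinecone metadata values must be strings, numbers, booleans, or lists of strings, so a nested dict in an upsert is rejected with exactly the error quoted above. A hedged sketch of a sanitizing pass that could be applied to metadata before upserting (this is an illustration of the workaround, not the actual Open WebUI fix):

```python
import json


def sanitize_metadata(metadata: dict) -> dict:
    """Coerce metadata into Pinecone-compatible values: strings, numbers,
    booleans, or lists of strings. Anything nested is JSON-encoded."""
    clean = {}
    for key, value in metadata.items():
        if isinstance(value, (str, int, float, bool)):
            clean[key] = value
        elif isinstance(value, list) and all(isinstance(v, str) for v in value):
            clean[key] = value
        else:
            # Nested dicts/lists trigger "Metadata value must be a string,
            # number, boolean or list of strings" -- serialize them instead.
            clean[key] = json.dumps(value)
    return clean
```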


@BAngelis commented on GitHub (Aug 15, 2025):

@theboyr, @rgaricano, I had to fall back to ChromaDB (which does not meet our requirements) just to get it working for now. Unfortunately the Pinecone support, and now I'm learning the n8n integration as well, are too buggy...


@theboyr commented on GitHub (Aug 16, 2025):

I moved everything to S3 Vectors when this happened, and it works flawlessly for my use case.

I do think the Pinecone integration needs to be updated to strip any nested metadata, but we're an AWS shop to begin with, and moving to S3 Vectors is better long-term for our management. Plus, once we get out of the free tier with Pinecone... it appears S3 Vectors will also be more cost-effective.

Reference: github-starred/open-webui#56473