[GH-ISSUE #5035] Request to ollama cancelled after 120 seconds #13835

Closed
opened 2026-04-19 20:25:30 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @mchoma on GitHub (Aug 30, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/5035

I am creating a RAG prompt from Open WebUI to Ollama. The request is expected to take a long time, so I configured AIOHTTP_CLIENT_TIMEOUT=3600 (1 hour); OLLAMA_KEEP_ALIVE is 5 minutes.
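For reference, this is how such environment variables can be applied to a containerized Open WebUI on OpenShift (a hedged sketch; the deployment name `open-webui` is an assumption and may differ in your cluster):

```shell
# Set the client timeout (seconds) and keep-alive on the Open WebUI deployment.
# Deployment name "open-webui" is illustrative -- adjust to your environment.
oc set env deployment/open-webui \
  AIOHTTP_CLIENT_TIMEOUT=3600 \
  OLLAMA_KEEP_ALIVE=5m
```

AIOHTTP_CLIENT_TIMEOUT governs how long Open WebUI itself will wait on a response from Ollama; it does not affect timeouts imposed by intermediaries such as the OpenShift router.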

I am facing a situation where Ollama cancels the request after 2 minutes:

```
time=2024-08-30T06:35:39.869Z level=INFO source=server.go:623 msg="llama runner started in 1.76 seconds"
time=2024-08-30T06:35:39.869Z level=DEBUG source=sched.go:458 msg="finished setting up runner" model=/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa
time=2024-08-30T06:35:39.869Z level=DEBUG source=routes.go:1346 msg="chat request" images=0 prompt="<|start_header_id|>user<|end_header_id|>\n\nUse the following context as your learned knowledge, inside <context></context> XML tags.\n<context>...</context>
DEBUG [process_single_task] slot data | n_idle_slots=1 n_processing_slots=0 task_id=1 tid="139885828540288" timestamp=1724999739
DEBUG [launch_slot_with_data] slot is processing task | slot_id=0 task_id=2 tid="139885828540288" timestamp=1724999739
DEBUG [update_slots] slot progression | ga_i=0 n_past=0 n_past_se=0 n_prompt_tokens_processed=1664 slot_id=0 task_id=2 tid="139885828540288" timestamp=1724999739
DEBUG [update_slots] kv cache rm [p0, end) | p0=0 slot_id=0 task_id=2 tid="139885828540288" timestamp=1724999739
time=2024-08-30T06:37:41.286Z level=DEBUG source=sched.go:462 msg="context for request finished"
[GIN] 2024/08/30 - 06:37:41 | 200 | 2m3s | 172.215.1.9 | POST "/api/chat"
time=2024-08-30T06:37:41.286Z level=DEBUG source=sched.go:334 msg="runner with non-zero duration has gone idle, adding timer" modelPath=/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa duration=5m0s
time=2024-08-30T06:37:41.286Z level=DEBUG source=sched.go:352 msg="after processing request finished event" modelPath=/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa refCount=0
DEBUG [log_server_request] request | method="POST" params={} path="/completion" remote_addr="127.0.0.1" remote_port=57340 status=200 tid="139885740594752" timestamp=1724999861
DEBUG [update_slots] slot released | n_cache_tokens=1670 n_ctx=2048 n_past=1669 n_system_tokens=0 slot_id=0 task_id=2 tid="139885828540288" timestamp=1724999861 truncated=false
```

I do not know whether I need to set something on the Open WebUI side, the Ollama side, or in OpenShift. I raised the OpenShift Router HAProxy timeout, but it did not help.
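For context, the OpenShift router timeout is usually raised per route via an annotation rather than globally. A hedged sketch (the route name `open-webui` is an assumption):

```shell
# Raise the HAProxy timeout for a single route; the default server-side
# timeout on OpenShift routes is typically 30s, so long-running requests
# through the router need this annotation. Route name is illustrative.
oc annotate route open-webui haproxy.router.openshift.io/timeout=3600s --overwrite
```

Note that this only matters for traffic that actually traverses the router; if Open WebUI talks to Ollama over an in-cluster Service, the router timeout is not in the path, which can help narrow down where the 120-second cutoff originates.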

I am not sure where this 120-second timeout comes from. I would be thankful for any hint.

Using
Open WebUI Version v0.3.16
Ollama 0.3.6

Installation Method

container on OpenShift

Environment

  • Open WebUI Version: v0.3.16

  • Ollama (if applicable): v0.3.6

  • Operating System: Linux

  • Browser (if applicable): Firefox


Reference: github-starred/open-webui#13835