Mirror of https://github.com/open-webui/open-webui.git, synced 2026-05-06 19:08:59 -05:00
[GH-ISSUE #18686] Calling /api/v1/retrieval/process/web/search causes the entire website to hang
Originally created by @OAburub on GitHub (Oct 28, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/18686
Check Existing Issues
Installation Method
Docker
Open WebUI Version
0.6.34
Ollama Version (if applicable)
No response
Operating System
Ubuntu 24.04
Browser (if applicable)
No response
Confirmation
Expected Behavior
The API endpoint should process the web search and page loading asynchronously, without causing the entire website to hang.
Actual Behavior
The web-loading step executes on the main thread, causing the rest of the website to hang (this is clearest when the web loader enters a retry loop).
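The hang described above can be reproduced in miniature with plain asyncio (this is an illustrative sketch, not Open WebUI code): a coroutine that does blocking work directly on the event loop prevents every other task, i.e. every other request, from running until it finishes.

```python
import asyncio
import time

async def handler_blocking():
    """A request handler that does blocking work directly on the event loop."""
    time.sleep(0.3)  # stands in for synchronous page loading / scraping

async def ping():
    """A tiny concurrent 'request'; it should finish almost instantly."""
    return time.monotonic()

async def demo():
    start = time.monotonic()
    ping_task = asyncio.create_task(ping())
    await handler_blocking()       # blocks the whole loop for 0.3 s
    finished_at = await ping_task  # ping only got to run after the block ended
    return finished_at - start

delay = asyncio.run(demo())
print(f"ping was delayed by {delay:.2f} s")
```

Even though `ping` was scheduled first, it cannot run until the blocking handler returns, so its delay is the full 0.3 s rather than near zero.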
Steps to Reproduce
a. a bearer token, generated from the Account page in Settings
b. the environment variable env=dev set, so that the Swagger docs page (/docs) is available
c. a browser where you are logged in to Open WebUI
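With those prerequisites, the endpoint can be hit from a second client while you watch the logged-in browser tab hang. A hedged sketch of that call (the endpoint path is taken from the issue title; the `query` field name is an assumption — check the schema on the /docs page):

```python
import json
from urllib.request import Request

def build_search_request(base_url: str, token: str, query: str) -> Request:
    """Build the POST request used to trigger the hang from a second client."""
    return Request(
        url=f"{base_url}/api/v1/retrieval/process/web/search",
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        data=json.dumps({"query": query}).encode(),
    )

# Hypothetical local instance and token, for illustration only.
req = build_search_request("http://localhost:8080", "sk-example-token", "open webui")
# While urllib.request.urlopen(req) is in flight, try navigating the UI in
# the logged-in browser tab: it stops responding until the call completes.
```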
Logs & Screenshots
Technically speaking, the execution works correctly; it just runs on the main thread rather than in its own task or thread. There are no relevant logs or errors, and a screenshot of the loading spinner would not be helpful.
Additional Information
Observing the process_web_search function in backend/open_webui/routers/retrieval.py, here's what I think is happening:
The search tasks are delegated to run_in_threadpool, so they do not execute on the main thread:
However, after the search results are returned, the web loader loads the result pages on the main thread:
loader.aload will always attempt to connect to the result pages and scrape their contents. The reason this is more observable with problematic URLs is that the retry loop also calls asyncio.sleep, in backend/open_webui/retrieval/web/utils.py:
case 1: the default web loader, at line 526:
case 2: every other web loader, all of which implement RateLimitMixin, at line 124:
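The retry pattern both cases describe looks roughly like the sketch below (the names and backoff schedule are illustrative, not the actual RateLimitMixin code). Each retry repeats the whole fetch after a sleep, so a misbehaving URL keeps the request handler busy for many times the single-fetch duration.

```python
import asyncio

async def fetch_with_retries(fetch, retries: int = 3, backoff: float = 0.01):
    """Retry an async fetch with a growing sleep between attempts."""
    last_exc = None
    for attempt in range(retries):
        try:
            return await fetch()
        except Exception as exc:  # e.g. a rate-limit or connection error
            last_exc = exc
            await asyncio.sleep(backoff * (attempt + 1))  # wait, then retry
    raise last_exc

attempts = 0

async def flaky_fetch():
    """Fails twice (as a problematic URL would), then succeeds."""
    global attempts
    attempts += 1
    if attempts < 3:
        raise RuntimeError("429 Too Many Requests")
    return "page contents"

result = asyncio.run(fetch_with_retries(flaky_fetch))
print(result, attempts)
```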
I'm fairly certain the solution is to delegate the web loading to a separate task.
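A minimal sketch of what that fix could look like, assuming the blocking page-loading is moved off the event loop (here via asyncio.to_thread; starlette's run_in_threadpool, already used for the search step, would work the same way — this is a proposal sketch, not a patch):

```python
import asyncio
import time

def load_pages_blocking():
    """Stands in for connecting to and scraping the result pages."""
    time.sleep(0.3)  # simulated blocking network I/O
    return ["doc1", "doc2"]

async def handler_offloaded():
    # Await a worker thread instead of blocking the event loop directly.
    return await asyncio.to_thread(load_pages_blocking)

async def ping():
    """A tiny concurrent 'request' that should not be delayed."""
    return time.monotonic()

async def demo():
    start = time.monotonic()
    ping_task = asyncio.create_task(ping())
    docs = await handler_offloaded()  # loop stays free while pages load
    finished_at = await ping_task
    return docs, finished_at - start

docs, ping_delay = asyncio.run(demo())
print(docs, f"ping delayed only {ping_delay:.3f} s")
```

Unlike the blocking version, the concurrent ping completes almost immediately while the 0.3 s page load runs in the worker thread.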