Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-22 06:02:06 -05:00)
issue: synchronous sqlalchemy usage blocks the event-loop, seriously harming responsiveness #4263
Originally created by @lattwood on GitHub (Mar 5, 2025).
Check Existing Issues
Installation Method: Git Clone
Open WebUI Version: n/a (master)
Ollama Version (if applicable): No response
Operating System: n/a
Browser (if applicable): n/a
Confirmation
Expected Behavior
Database queries should be executed asynchronously when being run via async request handlers.
Actual Behavior
Database queries in request handlers are executed synchronously from an asynchronous context, blocking the Python async event loop.
Steps to Reproduce
4770285c04/backend/open_webui/models/chats.py (L108)
4770285c04/backend/open_webui/routers/chats.py (L97-L100)
View those links? Do some grepping? :)
Logs & Screenshots
N/A
Additional Information
I am assuming a high-level understanding of event loops as they relate to async Python, but I am happy to explain where needed. Regarding the blockage specifically: FastAPI is unable to accept connections, respond to other requests, etc., all while Python/SQLAlchemy is blocking the event loop waiting for a response from the database.
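To make the impact concrete, here is a small, self-contained sketch (hypothetical names, not Open WebUI code) in which a synchronous "query" simulated with time.sleep serializes concurrent handlers, while dispatching it to a thread pool lets them overlap:

```python
import asyncio
import time

def sync_query() -> str:
    time.sleep(0.1)  # stands in for a blocking SQLAlchemy query
    return "row"

async def blocking_handler() -> str:
    return sync_query()  # runs directly on the event-loop thread

async def threadpool_handler() -> str:
    return await asyncio.to_thread(sync_query)  # runs in a worker thread

async def timed(handler, n: int = 5) -> float:
    """Time n concurrent 'requests' issued via asyncio.gather."""
    start = time.perf_counter()
    await asyncio.gather(*(handler() for _ in range(n)))
    return time.perf_counter() - start

async def main() -> tuple[float, float]:
    return await timed(blocking_handler), await timed(threadpool_handler)

blocked, threaded = asyncio.run(main())
# blocked ≈ 5 × 0.1 s (the calls serialize); threaded ≈ 0.1 s (they overlap)
```

While the blocking variant runs, the loop cannot service anything else, which is exactly the responsiveness problem described above.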
I have a lot of empathy for the team & project over this, because I've had to deal with this before on closed source projects. It sucks there isn't more widespread knowledge about this issue, but at least there's usually three ways to solve it. In this case there's two, but I'll include all three options for the sake of completeness, and I've provided a python sample at the bottom of this issue that will demonstrate the ultimate impact of this problem, as well as how the 3 options listed can solve it.
Depends resolution blocks too. See the get_session_user handler and its dependency, get_current_user.
The options:
1. Add async_-prefixed versions of the blocking methods via a decorator. This would fix asyncio.gather usage at the very least.
2. Wrap blocking calls with FastAPI/Starlette's run_in_threadpool method. This is what FastAPI does internally when calling synchronous handlers or doing Depends resolution. An example is below.
3. Migrate to async SQLAlchemy: create_async_engine vs create_engine.
Here's an example of the decorator solution that creates async versions of all methods on a class and prefixes them with async_. The example also demonstrates the UX of handling concurrent HTTP requests on a single FastAPI/Uvicorn worker, both for the current state of affairs and for the three options above, through the use of asyncio.gather. And here is the output when I run it locally.
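The author's sample and its output were not preserved in this mirror. Below is a minimal sketch, under that assumption, of the decorator idea: it attaches an async_-prefixed wrapper for every public method, dispatching to a thread pool via the stdlib asyncio.to_thread (Starlette's run_in_threadpool serves the same purpose inside FastAPI). Class and method names are illustrative:

```python
import asyncio
import functools
import inspect

def asyncify_methods(cls):
    """Add an async_<name> thread-pool wrapper for each public method."""
    for name, fn in inspect.getmembers(cls, inspect.isfunction):
        if name.startswith("_") or name.startswith("async_"):
            continue

        def make_wrapper(sync_fn):
            @functools.wraps(sync_fn)
            async def wrapper(self, *args, **kwargs):
                # Run the original sync method in a worker thread so the
                # event loop stays free while the database call is pending.
                return await asyncio.to_thread(sync_fn, self, *args, **kwargs)
            return wrapper

        setattr(cls, f"async_{name}", make_wrapper(fn))
    return cls

@asyncify_methods
class ChatsTable:
    """Stand-in for a model class such as open_webui's Chats."""

    def get_chat_by_id(self, chat_id: str) -> dict:
        # Imagine a blocking SQLAlchemy session.query(...) here.
        return {"id": chat_id}

async def main() -> dict:
    # Handlers can now await the generated method without blocking the loop.
    return await ChatsTable().async_get_chat_by_id("abc")

print(asyncio.run(main()))  # {'id': 'abc'}
```

Existing synchronous call sites keep working unchanged; async handlers opt into the async_ variants.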
@tjbck commented on GitHub (Mar 5, 2025):
Second option seems to be the best for now, would appreciate PRs here!
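The "second option" here is wrapping blocking calls at the call site with run_in_threadpool. A minimal call-site sketch (function names are illustrative, not Open WebUI's; the stdlib asyncio.to_thread stands in for starlette.concurrency.run_in_threadpool, which behaves the same way, to keep the example dependency-free):

```python
import asyncio

def get_chats_by_user_id(user_id: str) -> list[dict]:
    # Stand-in for a blocking SQLAlchemy query.
    return [{"user_id": user_id, "title": "hello"}]

async def get_session_user_chat_list(user_id: str) -> list[dict]:
    # In a FastAPI handler this would read:
    #   return await run_in_threadpool(get_chats_by_user_id, user_id)
    return await asyncio.to_thread(get_chats_by_user_id, user_id)

print(asyncio.run(get_session_user_chat_list("u1")))
```

The downside, and the motivation for the decorator and code-generation approaches below, is that every blocking call site must be wrapped individually.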
@gaboe commented on GitHub (Mar 12, 2025):
Implemented Solution: Automatically Generated Async Files with Type Safety
I've implemented a solution to this issue that takes a different approach from the suggestions, specifically addressing both the blocking event loop and the typing challenges that arise when working with async wrappers.
Our Approach: Separate Generated Async Modules
Instead of adding async methods to existing classes or using run_in_threadpool directly, we generate entire separate async modules with properly typed interfaces. Here's why we took this route:
Type Safety Challenges
The approach suggested in the original issue (adding async_-prefixed methods via a decorator) works functionally but creates significant typing problems.
We initially tried solving this with PYI stub files, but that approach also had problems.
Our Solution: @asyncify with Module Generation
Our decorator creates entire separate async modules with full typing support. This generates a channels_async.py file with properly typed async wrappers.
Key Benefits of This Approach
Complete Type Safety: Both sync and async interfaces have proper typing that IDEs and type checkers can validate.
Clean Import Separation: Code can explicitly import from either the sync or async module based on its needs.
Minimal Runtime Overhead: No decorator overhead in production; all wrappers are generated at development time.
Parameter Preservation: Our generator properly handles method signatures, preserving all parameters.
Automatic Rebuilding: Files are only regenerated when the source changes (tracked via hash).
Clear Documentation: Generated async methods include docstrings that reference their sync counterparts.
No Event Loop Blocking: All database operations run in a thread pool, preventing FastAPI event loop blocking.
Implementation Notes
Unlike the original example, our implementation specifically:
Performance Impact
We've seen significant improvements in concurrent request handling with this approach. The event loop stays responsive even under heavy database load, and requests are properly parallelized across the threadpool.
The code for our asyncify decorator is more sophisticated than the example in the issue, as it handles parameter passing, method signatures, and file generation with proper typing.
This approach strikes a good balance between developer experience (DX) and runtime performance without requiring a major rewrite of the codebase to fully async SQLAlchemy.
asyncify.py
What do you guys say about that?
@tjbck commented on GitHub (Mar 14, 2025):
https://asyncer.tiangolo.com/ seems like a great option, PR Welcome!