[PR #20175] [CLOSED] fix: resolve SQLAlchemy connection exhaustion and enhance LLM reaction context for channels #41119

Closed
opened 2026-04-25 13:26:04 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/open-webui/open-webui/pull/20175
Author: @silentoplayz
Created: 12/25/2025
Status: Closed

Base: dev ← Head: feat/llm-reaction-context-and-fix


📝 Commits (4)

  • 8d7616d feat: enhance LLM context with reactions and fix RichTextInput error
  • 1b8c7a6 fix: resolve infinite recursion and connection pool exhaustion in message retrieval
  • 26e4937 fix: ensure quoted messages are included in channel LLM context
  • 93fea9e refac: this works

📊 Changes

3 files changed (+150 additions, -56 deletions)


📝 backend/open_webui/models/messages.py (+107 -54)
📝 backend/open_webui/routers/channels.py (+41 -2)
📝 src/lib/components/common/RichTextInput.svelte (+2 -0)

📄 Description

Pull Request Checklist

Note to first-time contributors: Please open a discussion post in Discussions to discuss your idea/fix with the community, and describe your changes clearly, before submitting a pull request.

This is to ensure large feature PRs are discussed with the community before work begins. If the community does not want a feature, or it is not relevant to Open WebUI as a project, that can be identified in the discussion before time is spent implementing it and submitting the PR.

Before submitting, make sure you've checked the following:

  • Target branch: Verify that the pull request targets the dev branch. Not targeting the dev branch will lead to immediate closure of the PR.
  • Description: Provide a concise description of the changes made in this pull request below.
  • Changelog: Ensure a changelog entry following the format of Keep a Changelog is added at the bottom of the PR description.
  • Documentation: If necessary, update relevant documentation Open WebUI Docs like environment variables, the tutorials, or other documentation sources.
  • Dependencies: Are there any new dependencies? Have you updated the dependency versions in the documentation?
  • Testing: Perform manual tests to verify the implemented fix/feature works as intended AND does not break any other functionality. Take this as an opportunity to make screenshots of the feature/fix and include it in the PR description.
  • Agentic AI Code: Confirm this Pull Request is not written by any AI Agent or has at least gone through additional human review AND manual testing. If any AI Agent is the co-author of this PR, it may lead to immediate closure of the PR.
  • Code review: Have you performed a self-review of your code, addressing any coding standard issues and ensuring adherence to the project's coding standards?
  • Title Prefix: To clearly categorize this pull request, prefix the pull request title using one of the following:
    • BREAKING CHANGE: Significant changes that may affect compatibility
    • build: Changes that affect the build system or external dependencies
    • ci: Changes to our continuous integration processes or workflows
    • chore: Refactor, cleanup, or other non-functional code changes
    • docs: Documentation update or addition
    • feat: Introduces a new feature or enhancement to the codebase
    • fix: Bug fix or error correction
    • i18n: Internationalization or localization changes
    • perf: Performance improvement
    • refactor: Code restructuring for better maintainability, readability, or scalability
    • style: Changes that do not affect the meaning of the code (white space, formatting, missing semi-colons, etc.)
    • test: Adding missing tests or correcting existing tests
    • WIP: Work in progress, a temporary label for incomplete or ongoing work

Changelog Entry

Description

This PR introduces three key improvements:

  1. LLM Reaction Awareness: Enhances the channel context fed to LLMs by including message reaction data (reaction names and counts) in the message formatting. This allows the LLM to understand and reference user reactions in its responses.
  2. Recursion and Connection Pool Fix: Resolves a critical RecursionError and QueuePool limit exhaustion issue in backend/open_webui/models/messages.py. The recursive fetching of reply_to_message was causing infinite loops and depleting DB connections. This is fixed by:
    • Implementing inline, non-recursive fetching for reply_to_message.
    • Explicitly reusing the SQLAlchemy db session in recursive/nested calls.
    • Correcting Pydantic validation for UserNameResponse to handle ORM objects using from_attributes=True.
  3. RichTextInput Bug Fix: Prevents a RangeError (invalid position) in the RichTextInput.svelte component when the selection depth is 0.
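The session-reuse and inline-fetch pattern described in point 2 can be sketched as follows. This is a minimal standalone illustration, not the actual Open WebUI code: the `Message` model, the `SessionLocal` factory, and the function body are simplified stand-ins.

```python
from sqlalchemy import create_engine, Column, String
from sqlalchemy.orm import declarative_base, sessionmaker

# Hypothetical minimal stand-in for Open WebUI's Message table.
Base = declarative_base()

class Message(Base):
    __tablename__ = "message"
    id = Column(String, primary_key=True)
    reply_to_id = Column(String, nullable=True)
    content = Column(String)

engine = create_engine("sqlite://")  # in-memory DB for the sketch
Base.metadata.create_all(engine)
SessionLocal = sessionmaker(bind=engine)

def get_message_by_id(message_id, db=None):
    """Fetch a message, reusing the caller's session when one is supplied.

    Opening a fresh session per nested lookup is what exhausts the pool;
    threading `db` through keeps the whole operation on one connection.
    """
    owns_session = db is None
    if owns_session:
        db = SessionLocal()
    try:
        message = db.get(Message, message_id)
        if message is not None and message.reply_to_id:
            # Inline, non-recursive fetch of the quoted message: one extra
            # query instead of re-entering get_message_by_id, which could
            # recurse indefinitely on a reply cycle (A -> B -> A).
            message.reply_to_message = db.get(Message, message.reply_to_id)
        return message
    finally:
        if owns_session:
            db.close()
```

Callers that already hold a session pass it in; top-level callers omit it and the function manages its own lifecycle.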

Added

  • Reaction context (name, count, user names) is now appended to message content in model_response_handler within backend/open_webui/routers/channels.py.
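As a rough sketch of what that appended context might look like (the helper name and exact output format here are assumptions, not the PR's actual implementation):

```python
def format_message_with_reactions(content: str, reactions: list[dict]) -> str:
    """Append reaction context so the LLM can see who reacted and how.

    Each reaction dict is assumed to carry 'name', 'count', and
    'user_names' keys, mirroring the fields listed above.
    """
    if not reactions:
        return content
    parts = [
        f"{r['name']} x{r['count']} ({', '.join(r['user_names'])})"
        for r in reactions
    ]
    return f"{content}\n[Reactions: {'; '.join(parts)}]"
```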

Changed

  • Refactored get_message_by_id, get_thread_replies_by_message_id, and get_reactions_by_message_id in backend/open_webui/models/messages.py to accept an optional db session for reuse.
  • Replaced recursive self.get_message_by_id calls for reply_to_message with inline DB queries to break the recursion loop.
  • Updated UserNameResponse validation to use from_attributes=True.
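The `from_attributes=True` change can be illustrated with a minimal Pydantic v2 model. `UserRow` here is a hypothetical stand-in for the ORM row, and the field names are assumed:

```python
from pydantic import BaseModel, ConfigDict

class UserRow:
    """Stand-in for a SQLAlchemy ORM instance: attributes, not a mapping."""
    def __init__(self, id: str, name: str):
        self.id = id
        self.name = name

class UserNameResponse(BaseModel):
    # Without from_attributes=True, model_validate() rejects plain objects
    # because it expects dict-like input; with it, Pydantic reads the
    # fields directly off the ORM object's attributes.
    model_config = ConfigDict(from_attributes=True)

    id: str
    name: str

user = UserNameResponse.model_validate(UserRow(id="u1", name="Alice"))
```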

Fixed

  • Fixed RecursionError: maximum recursion depth exceeded when fetching nested message replies.
  • Fixed QueuePool limit of size 5 overflow 10 reached (SQLAlchemy connection exhaustion) by ensuring session reuse.
  • Fixed Uncaught RangeError: There is no position before the top-level node in src/lib/components/common/RichTextInput.svelte.
  • Fixed missing context for quoted messages ("Direct Reply") by explicitly injecting the reply_to_message into the LLM thread history if it's not already present.
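The "Direct Reply" fix can be sketched as a small guard run while assembling the LLM context (the helper name and message shape are illustrative, not the actual Open WebUI code):

```python
def ensure_reply_in_history(history: list[dict], message: dict) -> list[dict]:
    """Prepend the quoted message if the thread history doesn't contain it.

    Without this, the model sees the reply but not the message it quotes,
    so references like "as you said above" lose their referent.
    """
    reply = message.get("reply_to_message")
    if reply and all(m["id"] != reply["id"] for m in history):
        history = [reply, *history]
    return history
```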

Additional Information

  • The database session management refactor is a native solution that avoids introducing separate "lite" message-fetching methods: nested lookups reuse the caller's session, so data retrieval stays complete without the performance penalty of opening a new connection per nested call.
  • This PR should also solve https://github.com/open-webui/open-webui/issues/20157.
  • This PR also addresses a feature request of my own (screenshot attached).

Screenshots

(Three screenshots attached.)

Contributor License Agreement

By submitting this pull request, I confirm that I have read and fully agree to the Contributor License Agreement (CLA), and I am providing my contributions under its terms.

Note

Deleting the CLA section will lead to immediate closure of your PR and it will not be merged in.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-25 13:26:04 -05:00

Reference: github-starred/open-webui#41119