[PR #20571] [MERGED] fix(db): release connection before LLM call in Ollama /api/chat #41306

Closed
opened 2026-04-25 13:34:37 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/open-webui/open-webui/pull/20571
Author: @Classic298
Created: 1/11/2026
Status: Merged
Merged: 1/11/2026
Merged by: @tjbck

Base: dev ← Head: fix/db-pool-ollama-api-chat


📝 Commits (1)

  • 3834e62 fix(db): release connection before LLM call in Ollama /api/chat

📊 Changes

1 file changed (+5 additions, -3 deletions)

View changed files

📝 backend/open_webui/routers/ollama.py (+5 -3)

📄 Description

Remove Depends(get_session) from the /api/chat endpoint to prevent database connections from being held during the entire duration of LLM calls (30-60+ seconds for streaming responses).

Previously, the database session was acquired at request start and held until the streaming response completed. Under concurrent load, this exhausted the connection pool, causing QueuePool timeout errors for other database operations.
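The exhaustion scenario can be sketched with a tiny stand-in pool (this is an illustrative simulation, not Open WebUI or SQLAlchemy code; `TinyPool` and its timeout error are hypothetical analogues of SQLAlchemy's `QueuePool` behavior): two requests hold their connections for the full duration of a slow streaming call, and a third caller that only needs a quick query times out.

```python
# Hypothetical illustration: a pool of 2 connections where each in-flight
# streaming request holds its connection for the whole LLM call, so a
# third concurrent caller times out waiting for a checkout.
import queue


class TinyPool:
    def __init__(self, size, timeout):
        self._q = queue.Queue()
        for i in range(size):
            self._q.put(f"conn-{i}")
        self.timeout = timeout

    def checkout(self):
        try:
            return self._q.get(timeout=self.timeout)
        except queue.Empty:
            # Analogous to SQLAlchemy's "QueuePool limit ... reached" timeout
            raise TimeoutError("pool exhausted")

    def checkin(self, conn):
        self._q.put(conn)


pool = TinyPool(size=2, timeout=0.1)
held = [pool.checkout(), pool.checkout()]  # two streaming requests in flight

try:
    pool.checkout()  # a third caller needs the DB for a quick check
    exhausted = False
except TimeoutError:
    exhausted = True

for conn in held:
    pool.checkin(conn)

print(exhausted)  # True: the pool was exhausted while the slow calls ran
```

The real pool is larger, but the failure mode is the same: connection hold time scales with LLM response time rather than query time, so a modest number of concurrent streams can starve every other database operation.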

The fix lets Models.get_model_by_id() and has_access() manage their own short-lived sessions internally, releasing the connection immediately after the quick authorization checks complete, before the slow external LLM API call begins.
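The resulting pattern can be sketched as follows (a minimal, hypothetical sketch: `get_db`, `get_model_by_id`, `chat_endpoint`, and the `OPEN_SESSIONS` counter are illustrative stand-ins, not the actual Open WebUI implementation): each helper opens and closes its own session around one query, so nothing is checked out while the slow call runs.

```python
# Hypothetical sketch of the short-lived-session pattern: authorization
# helpers acquire and release their own sessions, so no DB connection is
# held across the slow external LLM call.
from contextlib import contextmanager

OPEN_SESSIONS = 0  # tracks how many sessions are currently checked out


@contextmanager
def get_db():
    """Short-lived session: acquired and released around a single query."""
    global OPEN_SESSIONS
    OPEN_SESSIONS += 1
    try:
        yield {"session": "fake"}
    finally:
        OPEN_SESSIONS -= 1


def get_model_by_id(model_id):
    # Quick authorization lookup; the session is released on return.
    with get_db() as db:
        return {"id": model_id, "db": db is not None}


def chat_endpoint(model_id):
    model = get_model_by_id(model_id)  # fast DB check, session released here
    assert OPEN_SESSIONS == 0          # nothing held during the LLM call
    # ... slow external LLM API call would stream its response here ...
    return f"streamed reply for {model['id']}"


print(chat_endpoint("llama3"))
```

Contrast this with the previous shape, where a `Depends(get_session)`-injected session lived from request start until the streaming response finished; here the connection's hold time is bounded by the authorization queries alone.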

Contributor License Agreement

By submitting this pull request, I confirm that I have read and fully agree to the Contributor License Agreement (CLA), and I am providing my contributions under its terms.

Note

Deleting the CLA section will lead to immediate closure of your PR, and it will not be merged.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-25 13:34:37 -05:00

Reference: github-starred/open-webui#41306