[GH-ISSUE #22206] issue: [Critical] Multiple API endpoints load entire dataset into memory at once, causing OOM crash and service unavailability #58327

Open
opened 2026-05-05 22:55:13 -05:00 by GiteaMirror · 3 comments

Originally created by @ShirasawaSama on GitHub (Mar 4, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/22206

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Git Clone

Open WebUI Version

dev

Ollama Version (if applicable)

No response

Operating System

MacOS

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

These endpoints should use pagination or streaming to handle large datasets, avoiding loading all data into memory at once.
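As a sketch of the suggested fix, a minimal offset/limit pagination helper might look like the following. All names (`paginate`, `page`, `limit`) are illustrative and not taken from the Open WebUI codebase:

```python
# Hypothetical sketch: return one page of results plus metadata instead
# of the whole dataset. In a real endpoint, the slicing would happen in
# the DB query (OFFSET/LIMIT), not on an in-memory list.

def paginate(items, page: int = 1, limit: int = 50):
    """Return one page of `items` with paging metadata."""
    if page < 1 or limit < 1:
        raise ValueError("page and limit must be >= 1")
    start = (page - 1) * limit
    return {
        "items": items[start:start + limit],
        "page": page,
        "limit": limit,
        "total": len(items),
    }
```

The same `page`/`limit` query parameters could then be exposed on each affected endpoint so clients fetch bounded chunks.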

Actual Behavior

These endpoints load the entire dataset into memory at once, causing Out-of-Memory (OOM) crashes and service unavailability when the data volume is large.

Steps to Reproduce

Call any of the endpoints listed under Additional Information on an instance whose account holds a large dataset (many chats, feedback entries, or files); the server loads the full result set into memory.

Logs & Screenshots

None provided.

Additional Information

OOM risk by priority


High — Endpoints that will OOM for regular users

| Method + Path | Notes |
|---------------|-------|
| `GET /chats/all` | All chats for the current user, no pagination. |
| `GET /evaluations/feedbacks/user` | All feedback for the current user, includes large JSON. |
| `GET /files/` | File list for the current user with content by default; will OOM. |

Medium — Endpoints that will OOM for admins

| Method + Path | Notes |
|---------------|-------|
| `GET /evaluations/feedbacks/all` | Admin: full feedback table in one response. |
| `GET /evaluations/feedbacks/all/export` | Admin: full feedback export. |
| `GET /chats/all/db` | Admin: full chat DB export. |
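For the export endpoints in particular, streaming the result record by record (e.g. as NDJSON) keeps memory bounded regardless of table size. A minimal sketch, where `fetch_rows` is a hypothetical stand-in for a DB cursor that yields one record at a time (not a real Open WebUI function):

```python
import json

# Hypothetical sketch: yield one JSON line per record instead of
# building one large list/dict for the whole export.

def stream_export(fetch_rows):
    """Yield NDJSON lines, one per record."""
    for row in fetch_rows():
        yield json.dumps(row) + "\n"

# In a FastAPI app, this generator could be wrapped in a
# StreamingResponse(media_type="application/x-ndjson") so the
# response body is written incrementally.
```

Combined with server-side chunked iteration on the query, peak memory stays proportional to one batch rather than the whole table.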

Low — Scale-dependent; usually small

| Method + Path | Notes |
|---------------|-------|
| `GET /prompts/` | Typically small. |
| `GET /models/export` | Same. |
| `GET /tools/export` | Same. |
| `GET /skills/export` | Same. |
| `GET /functions/export` | Same. |
| `GET /users/all` | Same. |
| `GET /channels/` (full list) | Same. |
GiteaMirror added the bug label 2026-05-05 22:55:13 -05:00

@gaby commented on GitHub (Mar 8, 2026):

@ShirasawaSama This is actually a security issue; it should have been reported privately, not publicly.

Ping @tjbck @Classic298


@Classic298 commented on GitHub (Mar 8, 2026):

UNTESTED reference-only PR: https://github.com/open-webui/open-webui/pull/22464


@ShirasawaSama commented on GitHub (Mar 9, 2026):

@gaby This issue has persisted since version 0.5, so a little more time won't make much difference.

Reference: github-starred/open-webui#58327