[GH-ISSUE #13992] issue: Apparent State Sync Issue with OpenAI API from LocalAI #17097

Closed
opened 2026-04-19 22:51:15 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @sempervictus on GitHub (May 17, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/13992

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

0.6.9

Ollama Version (if applicable)

No response

Operating System

Ubuntu 22.04

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

Submitting a processing request to a model should yield a response.

Actual Behavior

After being connected to the LocalAI API in the same Docker environment for some time, Open WebUI "loses track" of model interactions and returns empty responses while processing is clearly still ongoing. Other clients of the same API continue to work correctly, including LocalAI's own chat service and Flowise.

Steps to Reproduce

  1. Deploy `localai` and `open-webui` in a Docker environment (we also have `qdrant` wired to the `open-webui` container, in case that's "a thing")
  2. Configure Open WebUI to use LocalAI's API
  3. Run several iterations of conversation against llama4, command-a, command-r, or any other fairly complex model, using a small model like phi4-mini-reasoning for tasking. Starting separate conversations seems to trigger the issue more readily, especially when pushing web search/tool use.
  4. Observe "0 sites searched" during processing and an empty result with a completion notification, even though no EOT was sent
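
For reference, the stack described in step 1 can be sketched as a minimal Docker Compose file. This is an illustrative reconstruction, not the reporter's actual configuration: the image tags, ports, and the `OPENAI_API_BASE_URL`/`OPENAI_API_KEY` variables are assumptions based on the standard LocalAI and Open WebUI docs.

```yaml
# Hypothetical minimal stack; service names, ports, and env values are
# illustrative, not the reporter's actual deployment.
services:
  localai:
    image: localai/localai:latest
    ports:
      - "8080:8080"          # LocalAI serves an OpenAI-compatible API here

  open-webui:
    image: ghcr.io/open-webui/open-webui:0.6.9
    ports:
      - "3000:8080"          # Open WebUI listens on 8080 inside the container
    environment:
      # Point Open WebUI at LocalAI's OpenAI-compatible endpoint (step 2).
      - OPENAI_API_BASE_URL=http://localai:8080/v1
      - OPENAI_API_KEY=sk-dummy   # LocalAI does not require a real key by default
    depends_on:
      - localai

  qdrant:
    image: qdrant/qdrant:latest
    ports:
      - "6333:6333"
```

With this in place, steps 3 and 4 amount to driving multiple concurrent conversations through the Open WebUI frontend at `http://localhost:3000`.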

Logs & Screenshots

![Image](https://github.com/user-attachments/assets/24239b89-b4cf-4f04-9786-1c87b44c2d13)

Additional Information

We've seen this when running against the pre-packaged `ollama` version of the container as well, but much more rarely, so I'm guessing this may lie in some common state-tracking logic shared between the APIs but exercised more heavily with the OpenAI targets. It may be a case of a common symptom with different root causes, but it seems worth noting.

GiteaMirror added the bug label 2026-04-19 22:51:15 -05:00
Reference: github-starred/open-webui#17097