[GH-ISSUE #23020] feat: Auto-reconnect and Resume Chat After Ollama Server Disconnection #35399

Closed
opened 2026-04-25 09:36:38 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @aheybati on GitHub (Mar 25, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/23020

Check Existing Issues

  • I have searched for all existing open AND closed issues and discussions for similar requests. I have found none that is comparable to my request.

Verify Feature Scope

  • I have read through and understood the scope definition for feature requests in the Issues section. I believe my feature request meets the definition and belongs in the Issues section instead of the Discussions.

Problem Description

Description:

Is your feature request related to a problem? Please describe.
When the connection to an external Ollama server is lost (network issues, server restart, VPN disconnection, etc.), the current chat session becomes unusable. Any message being processed is lost, and users must manually retry their messages after the connection is restored. There is no automatic reconnection mechanism to resume the chat seamlessly.

Describe the solution you'd like
I would like Open WebUI to implement an auto-reconnect and resume feature that:

  • Detects disconnection: detects when the Ollama server connection is lost
  • Preserves pending messages: stores in-progress messages when disconnection occurs
  • Auto-reconnects: automatically attempts to reconnect when the server becomes available again
  • Resumes chat: continues the conversation from where it left off without requiring manual intervention

Describe alternatives you've considered

  • Increasing AIOHTTP_CLIENT_TIMEOUT helps with slow responses but doesn't handle complete disconnections
  • ENABLE_REALTIME_CHAT_SAVE saves chat data but doesn't resume in-progress requests
  • Manual retry after reconnection works but requires user intervention

Additional context
This feature would be especially valuable for:

  • Users connecting to remote Ollama servers over VPN
  • Environments with unstable network connections
  • Long-running conversations where reconnection should be transparent to the user
  • Production deployments requiring high availability

Desired Solution

I would like Open WebUI to implement an auto-reconnect mechanism that:

  1. Detects when the Ollama connection is lost and shows connection status to the user
  2. Queues pending messages instead of discarding them when disconnection occurs
  3. Automatically retries connection at configurable intervals
  4. Resumes the pending requests once connection is restored
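Step 2 and step 4 above could be sketched roughly as follows. This is a hypothetical illustration only, not Open WebUI code: `PendingMessageQueue` and `send_fn` are made-up names standing in for whatever component actually posts a message to the Ollama backend.

```python
from collections import deque


class PendingMessageQueue:
    """Hypothetical sketch: hold undelivered messages and replay them
    in order once the connection is restored."""

    def __init__(self):
        self._pending = deque()

    def enqueue(self, message):
        """Store a message that could not be delivered (step 2)."""
        self._pending.append(message)

    def resume(self, send_fn):
        """Replay queued messages in order (step 4); if delivery fails
        again, put the message back at the front and stop."""
        delivered = []
        while self._pending:
            message = self._pending.popleft()
            try:
                send_fn(message)
            except ConnectionError:
                self._pending.appendleft(message)  # keep it for the next attempt
                break
            delivered.append(message)
        return delivered
```

The key property is that a failed replay requeues the message rather than discarding it, so chat history is never lost even across repeated disconnections.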

Suggested environment variables:

  • ENABLE_AUTO_RECONNECT (default: true)
  • RECONNECT_MAX_RETRIES (default: 3)
  • RECONNECT_INTERVAL_SECONDS (default: 5)
  • RECONNECT_BACKOFF_MULTIPLIER (default: 1.5)

The chat history should never be lost, and users should be able to continue their conversation seamlessly after temporary network issues without manual message resubmission.

Alternatives Considered

No response

Additional Context

No response


Reference: github-starred/open-webui#35399