Stops polling/fetching response from model/ollama when switching to other chat or other part of window like Admin panel #2832

Closed
opened 2025-11-11 15:15:22 -06:00 by GiteaMirror · 1 comment

Originally created by @navilg on GitHub (Nov 26, 2024).

Bug Report

Installation Method

Docker as well as pip

Environment

  • Open WebUI Version: 0.4.1

  • Ollama (if applicable): 0.4.5

  • Operating System: Windows 11

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.
  • [x] I have included the Docker container logs.
  • [x] I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

When I send a prompt to the model and switch to another chat before the model has responded, my prompt should not disappear, and the UI should keep polling for the model's response.

Actual Behavior:

When I send a prompt to the model, switch to another chat before the model has responded, and then return to the original chat, my prompt has disappeared and polling has stopped. I have to send the prompt again and stay on the same chat until the model responds.

Description

Bug Summary:
If I switch to another chat or to the admin panel in the same window while the WebUI is waiting for a response from the model/Ollama, my current prompt disappears and the WebUI stops polling for the response.
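For context (this is not Open WebUI's actual code, and all names below are hypothetical): this symptom typically appears when the in-flight prompt and streaming read are owned by the chat view component, so navigating away tears them down. A minimal sketch of the pattern that avoids it, with the stream consumed into a store that outlives the view:

```javascript
// Hypothetical sketch: a global store keeps the prompt and partial response
// alive across view switches, instead of holding them in component state.

const chatStore = new Map(); // chatId -> { prompt, chunks, done }

// Simulated model stream: yields response chunks with a small delay.
async function* modelStream() {
  for (const chunk of ["Pharaohs ", "ruled ", "ancient ", "Egypt."]) {
    await new Promise((r) => setTimeout(r, 10));
    yield chunk;
  }
}

// Store-backed consumer: keeps reading even if the chat view unmounts.
async function sendPrompt(chatId, prompt) {
  const entry = { prompt, chunks: [], done: false };
  chatStore.set(chatId, entry);
  for await (const chunk of modelStream()) {
    entry.chunks.push(chunk);
  }
  entry.done = true;
}

// Simulate the reported repro: start a prompt, "switch views", come back.
async function demo() {
  const pending = sendPrompt("chat-1", "Tell me some interesting facts about Pharaohs");
  // User navigates away mid-stream; the store keeps consuming in the background.
  await new Promise((r) => setTimeout(r, 15));
  // User returns: the prompt and partial response are still in the store.
  const entry = chatStore.get("chat-1");
  console.log("prompt kept:", entry.prompt !== undefined);
  await pending;
  console.log("response:", entry.chunks.join(""));
}

demo();
```

Because the store, not the view, owns the stream, returning to the chat only re-renders from `chatStore` rather than restarting the request.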

Reproduction Details

Steps to Reproduce:

  1. Install Ollama and Open WebUI.
  2. This is easiest to reproduce on a slower CPU/GPU or with a larger model, so that the response takes at least a few seconds.
  3. Send a prompt to a model (e.g., "Tell me some interesting facts about Pharaohs").
  4. While the WebUI is waiting for the response (the grey shaded lines that indicate loading), switch to another chat or click New Chat.
  5. Return to the original chat: the prompt you sent has disappeared, there is no response from the model/Ollama, and the grey loading lines are gone.

Logs and Screenshots

Browser Console Logs:

Docker Container Logs:

Screenshots/Screen Recordings (if applicable):

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]



@tjbck commented on GitHub (Nov 26, 2024):

#2647


Reference: github-starred/open-webui#2832