Download as text is incomplete when you have multiple models answering the question in the same chat. #1599

Closed
opened 2025-11-11 14:48:08 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @nviraj on GitHub (Jul 24, 2024).

Bug Report

Description

Bug Summary:
Download as text is incomplete when you have multiple models answering the question in the same chat.

Steps to Reproduce:
This is what I did: select Nemo, Llama 3.1, and Gemma 2 so that all three models answer the same question in one thread.
(Screenshot: https://github.com/user-attachments/assets/2f6d3931-ad94-41d5-a27c-f06f4a67eac6)

Expected Behavior:
Ideally, I would expect the download to include responses from all these models.

Actual Behavior:
When downloading as plain text, the export contains only a single response, either from the last model that generated or from the first model selected (I am not sure which).
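To illustrate the expected behavior, here is a minimal sketch of a plain-text exporter that walks every model response attached to a turn instead of stopping at the first one. The data shape (`turns`, `prompt`, `responses`, `model`, `content`) is hypothetical and not Open WebUI's actual chat schema:

```python
def chat_to_text(chat: dict) -> str:
    """Flatten a multi-model chat into plain text.

    Assumes a hypothetical structure where each turn holds one user
    prompt and a list of responses, one per model that answered.
    """
    lines = []
    for turn in chat["turns"]:
        lines.append(f"### User:\n{turn['prompt']}")
        # Iterate over ALL responses for this turn, not just the first
        # (or most recent) one -- this is the step the bug skips.
        for resp in turn["responses"]:
            lines.append(f"### {resp['model']}:\n{resp['content']}")
    return "\n\n".join(lines)


example_chat = {
    "turns": [
        {
            "prompt": "What is 2 + 2?",
            "responses": [
                {"model": "nemo", "content": "4"},
                {"model": "llama3.1", "content": "The answer is 4."},
                {"model": "gemma2", "content": "2 + 2 = 4"},
            ],
        }
    ]
}

text = chat_to_text(example_chat)
```

With this approach, the exported text for the example above would include all three model labels, which is what the expected behavior below describes.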

Environment

  • Open WebUI Version: 0.3.8

  • Ollama (if applicable): 0.2.8

  • Operating System: Windows 11

  • Browser (if applicable): Firefox 128.0

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Installation Method

Manual install via pip in a conda environment


Reference: github-starred/open-webui#1599