Chatting with two models at once, second request takes history (first answer) from one model and uses it in request to both models #1821

Closed
opened 2025-11-11 14:53:56 -06:00 by GiteaMirror · 3 comments
Owner

Originally created by @vlsav on GitHub (Aug 19, 2024).

Bug Report

Installation Method

docker

Environment

v0.3.13
no Ollama, two openai compatible endpoints

Linux, RedHat
Firefox 98.0

Confirmation:

  • [ X ] I have read and followed all the instructions provided in the README.md.
  • [ X ] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [ ] I have included the Docker container logs.
  • [ X ] I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

When chatting with two models at once, both conversations should be fully separated: the first model's answers should be used as history only in subsequent requests to the first model, and the second model's answers only in requests to the second model.
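To make the expected behavior concrete, here is a minimal sketch (not Open WebUI's actual code; the helper names and the `reply_for` callback are hypothetical) of keeping a separate message history per model, so that each follow-up request contains only that model's own prior answers:

```python
# Hypothetical sketch: one message history per model id, so requests
# to model A never contain model B's answers.
from collections import defaultdict

histories = defaultdict(list)  # model id -> list of chat messages


def ask(model, user_message, reply_for):
    """Build an OpenAI-style payload for `model` using only its own history."""
    histories[model].append({"role": "user", "content": user_message})
    # Snapshot the history at request time.
    payload = {"model": model, "messages": list(histories[model])}
    # Record the answer in this model's history only.
    answer = reply_for(model, user_message)
    histories[model].append({"role": "assistant", "content": answer})
    return payload


# Two models asked the same question; their histories must not mix.
fake = lambda m, q: f"{m} answer to {q!r}"
p1 = ask("model-a", "hello", fake)
p2 = ask("model-b", "hello", fake)
p3 = ask("model-a", "follow-up", fake)
```

In the buggy behavior described below, the second request was identical for both models, i.e. one model's assistant message leaked into the other model's history.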

Actual Behavior:

Initially I noticed quite strange responses to the second question; it looked as if one model had "stolen" the answer from the other model.
Then I swapped the models' order (first <-> second) and the situation with the answers also swapped.
So I inspected the requests going to the OpenAI API and noticed that the second request is identical for both models and contains the answer from only one model.

Description

Bug Summary:
Chats with several models are not fully separated; the answer from one model is used to continue the conversation with all of the models.

Reproduction Details

Steps to Reproduce:
As described above; there is not much more to add.

Logs and Screenshots

Browser Console Logs:
not applicable

Docker Container Logs:
not applicable

Screenshots/Screen Recordings (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Additional Information

The chat history doesn't contain full information about what is sent to the model on the second request, or how historical answers are added to the second (third, ...) request, so the chat logs don't help.

Author
Owner

@tjbck commented on GitHub (Aug 19, 2024):

Fixed on dev. Please check for duplicate issues before creating one, Thanks.

Author
Owner

@vlsav commented on GitHub (Aug 19, 2024):

@tjbck sorry, didn't find it. Thank you!

Author
Owner

@ruschestor commented on GitHub (Aug 20, 2024):

> Fixed on dev. Please check for duplicate issues before creating one, Thanks.

@tjbck, could you please share a link to the original issue? This also applies to me.
Thank you in advance!

Reference: github-starred/open-webui#1821