issue: v0.6.33 - Previous reply not sent with prompt #6633

Closed
opened 2025-11-11 17:01:52 -06:00 by GiteaMirror · 1 comment

Originally created by @frenzybiscuit on GitHub (Oct 8, 2025).

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

0.6.33

Ollama Version (if applicable)

No response

Operating System

Debian 12

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

Works as expected

Actual Behavior

Open WebUI does not appear to be sending previous replies with this specific prompt/model.

I have debug mode enabled on the LLM backend, and the backend output on the left of the image shows the issue. The very last line changes, but no previous context is sent.

Steps to Reproduce

I have no idea

Logs & Screenshots

  • https://github.com/user-attachments/assets/8cd9dc16-33cb-4a77-8629-e87ca89aaae4
  • https://github.com/user-attachments/assets/f2c2f4cc-d146-4cf2-beaf-562b90c80e60
  • https://github.com/user-attachments/assets/f9547b52-e0fc-47a4-ab22-801721510809
  • https://github.com/user-attachments/assets/0ac0dd30-bd44-4fd0-9b77-7422f863ec75

Additional Information

.

GiteaMirror added the bug label 2025-11-11 17:01:52 -06:00

@frenzybiscuit commented on GitHub (Oct 8, 2025):

It was the chat template, closing.
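
The issue does not include the template itself, but the symptom matches a chat template that formats only the latest message. A minimal sketch, using hypothetical template functions (not the actual template from this issue), shows how such a template silently discards every earlier turn:

```python
# Hypothetical illustration: a chat template that renders only the final
# message drops all prior conversation turns, matching the symptom in the
# debug log ("the very last line changes, but no previous context is sent").

def broken_template(messages):
    # Formats only the last message; earlier turns never reach the backend.
    return f"User: {messages[-1]['content']}\nAssistant:"

def fixed_template(messages):
    # Formats every turn so the backend receives the full conversation.
    lines = [f"{m['role']}: {m['content']}" for m in messages]
    return "\n".join(lines) + "\nAssistant:"

messages = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
    {"role": "user", "content": "And its population?"},
]

print(broken_template(messages))  # prior turns missing from the prompt
print(fixed_template(messages))   # full history present in the prompt
```

In practice the fix is made in the backend's chat template configuration (e.g. the model's template definition), not in Open WebUI itself.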


Reference: github-starred/open-webui#6633