issue: Issues with continue response! #4611

Closed
opened 2025-11-11 15:58:35 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @frenzybiscuit on GitHub (Mar 30, 2025).

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

0.5.20

Ollama Version (if applicable)

0.6.3

Operating System

Debian 12

Browser (if applicable)

Firefox

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

Works correctly

Actual Behavior

There are two issues:

A) When using "Continue Response" on the Dockerized Ollama model (Qwen 2.5 0.5B), nothing happens about 95% of the time; the remaining 5% of the time it works correctly (I have to press the button several times before it responds).

B) When using "Continue Response" on the TabbyAPI (OpenAI-compatible) model, it works, but the output is not a direct continuation of the previous message. For example, if the model's response is cut off because it hits the max-token limit, pressing the button produces a new reply rather than picking up exactly where the previous response left off.
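For context on item B, a "direct continuation" over an OpenAI-compatible chat-completions API would typically resend the conversation with the truncated assistant text as the final message, so a backend that supports assistant-message prefill (as TabbyAPI does) completes that text in place instead of starting a fresh reply. The sketch below only builds such a request payload; the function and message-field names are illustrative assumptions, not Open WebUI's actual implementation:

```python
# Hypothetical sketch: constructing a "continue" request body for an
# OpenAI-compatible /v1/chat/completions endpoint. The key idea is that
# the truncated assistant reply is appended as the LAST message, so a
# backend supporting assistant prefill continues it verbatim rather
# than generating a brand-new answer.

def build_continue_payload(model, history, partial_reply, max_tokens=256):
    """Return a chat-completions payload asking the model to continue
    `partial_reply` instead of answering the conversation anew."""
    messages = list(history)
    # Re-send the cut-off assistant message as the final turn.
    messages.append({"role": "assistant", "content": partial_reply})
    return {
        "model": model,
        "messages": messages,
        "max_tokens": max_tokens,
    }

payload = build_continue_payload(
    "qwen2.5-0.5b",
    [{"role": "user", "content": "Write a long story."}],
    "Once upon a time, the",
)
```

Note that not every backend honors a trailing assistant message this way; some simply treat it as a completed turn, which would produce exactly the "new reply instead of continuation" behavior described in item B.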

Steps to Reproduce

Described above

Logs & Screenshots

Additional Information

No response

GiteaMirror added the bug label 2025-11-11 15:58:35 -06:00

Reference: github-starred/open-webui#4611