issue: filter pipelines - missing LLM outputs (outlet call) for title/follow-up/tag generation requests #6361

Closed
opened 2025-11-11 16:52:37 -06:00 by GiteaMirror · 3 comments
Owner

Originally created by @frdeng on GitHub (Sep 10, 2025).

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Other

Open WebUI Version

v0.6.22

Ollama Version (if applicable)

No response

Operating System

k8s

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

Filter pipelines should receive LLM outputs (an outlet call) for title, follow-up, and tag generation requests.

Actual Behavior

Only inputs (inlet calls) are observed for title, follow-up, and tag generation requests.

Steps to Reproduce

  1. Deploy Pipelines.
  2. Enable title/tag/follow-up generation.
  3. Add a filter pipeline.
  4. Monitor the Pipelines logs.
  5. Observe that outlet calls are missing for title/tag/follow-up generation responses.

Logs & Screenshots

n/a

Additional Information

No response

GiteaMirror added the bug label 2025-11-11 16:52:37 -06:00
Author
Owner

@tjbck commented on GitHub (Sep 11, 2025):

`outlet` is only supported for main message I/O.

Author
Owner

@frdeng commented on GitHub (Sep 12, 2025):

If only the main message is supported, then don't send the follow-up/title/tag generation messages to inlet either; otherwise pipeline developers have to add logic to handle the missing outlet calls. @tjbck

Author
Owner

@selenecodes commented on GitHub (Oct 20, 2025):

Not having this makes little sense, though, because right now we do log the inlets for tasks in:
https://github.com/open-webui/open-webui/blob/7a83e7dfa367d19f762ec17cac5e4a94ea2bd97d/backend/open_webui/routers/tasks.py#L372-L376

Is there any objection to me making a PR that adds the following:

  • LLM generation filter output for tasks
  • Upvotes/downvotes users give on messages (evaluations). I would like to add at least an inlet and maybe an outlet so that we can log user evals into Langfuse
Reference: github-starred/open-webui#6361