[PR #20338] [CLOSED] feat: persist filter-modified model ID to message metadata #41204

opened 2026-04-25 13:29:27 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/open-webui/open-webui/pull/20338
Author: @Classic298
Created: 1/2/2026
Status: Closed

Base: dev ← Head: filter-access-adjustment


📝 Commits (2)

  04b03d4 Update main.py
  25f3115 add event emitter

📊 Changes

2 files changed (+8 additions, -1 deletions)


📝 backend/open_webui/main.py (+1 -1)
📝 src/lib/components/chat/Chat.svelte (+7 -0)

📄 Description

Summary

RELATED / FIXES: https://github.com/open-webui/open-webui/discussions/20294

When a filter function modifies the model in its inlet hook to route a message to a different model, the database now stores the updated model ID. Filters can also emit a chat:model event to update the UI in real time.

Problem

The model_id variable was captured at the start of the chat_completion function, and the database save used this pre-captured value instead of the potentially modified value in form_data after the inlet filters had run.

Before:

  1. User selects Model A
  2. Inlet filter changes model to Model B
  3. Model B processes the message
  4. Database stores model: Model A (wrong)
  5. UI shows Model A

After this change:

  1. User selects Model A
  2. Inlet filter changes model to Model B
  3. Model B processes the message
  4. Database stores model: Model B (correct)
  5. Filter emits chat:model event, UI shows Model B
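The fix pattern described above can be sketched in a few lines. This is an illustrative, simplified model of the bug and its correction, not the actual open-webui code; run_inlet_filters and save_message are hypothetical stand-ins for the real machinery in main.py.

```python
# Hypothetical sketch of the bug/fix described above; function names
# are illustrative, not the actual open-webui API.

saved = {}

def run_inlet_filters(form_data):
    # A filter that reroutes to "model-b", as in the steps above.
    form_data["model"] = "model-b"
    return form_data

def save_message(model):
    # Stand-in for persisting the message metadata to the database.
    saved["model"] = model

def chat_completion(form_data):
    model_id = form_data["model"]             # captured before filters run
    form_data = run_inlet_filters(form_data)  # filter may rewrite "model"
    # Fixed behavior: persist the possibly-modified value from form_data,
    # falling back to the pre-captured model_id.
    save_message(model=form_data.get("model", model_id))

chat_completion({"model": "model-a"})
print(saved["model"])  # -> model-b
```

The buggy version would have passed model_id directly to save_message, persisting "model-a" even though "model-b" handled the request.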

Use Case: Model Routing in Filters

Filter authors can now implement model routing that persists correctly:

"""
title: Model Router Test Filter
author: Test
version: 0.2.0
description: Routes to gpt-5.2 when "this is a hard task" is in the message
"""

from pydantic import BaseModel, Field
from typing import Optional


class Filter:
    class Valves(BaseModel):
        priority: int = Field(default=0, description="Filter priority")
        target_model: str = Field(
            default="gpt-5.2", description="Model to route hard tasks to"
        )

    def __init__(self):
        self.valves = self.Valves()

    async def inlet(
        self, body: dict, __event_emitter__, __user__: Optional[dict] = None
    ) -> dict:
        # Get the last user message
        messages = body.get("messages", [])
        if not messages:
            return body

        last_message = messages[-1].get("content", "")

        # Check for trigger phrase (case-insensitive)
        if "this is a hard task" in last_message.lower():
            new_model = self.valves.target_model
            print(f"[Model Router] Routing to {new_model}")
            body["model"] = new_model

            # Emit event to update frontend UI with new model
            await __event_emitter__(
                {
                    "type": "chat:model",
                    "data": {"model": new_model},
                }
            )

        return body

The selected model will now be correctly displayed in the chat UI after the response.
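For filter authors who want to try the routing pattern outside Open WebUI, here is a minimal, hypothetical harness. The emitter below is a stub; in Open WebUI, __event_emitter__ is injected by the server, and the model names ("gpt-4" as the starting model) are placeholders.

```python
import asyncio

# Collected events, so we can inspect what the filter emitted.
events = []

async def fake_emitter(event):
    # Stub for the server-injected __event_emitter__.
    events.append(event)

async def inlet(body, __event_emitter__):
    # Condensed version of the inlet hook above: reroute hard tasks
    # and notify the UI via a chat:model event.
    if "this is a hard task" in body["messages"][-1]["content"].lower():
        body["model"] = "gpt-5.2"
        await __event_emitter__(
            {"type": "chat:model", "data": {"model": "gpt-5.2"}}
        )
    return body

body = asyncio.run(inlet(
    {"model": "gpt-4",
     "messages": [{"role": "user", "content": "This is a hard task"}]},
    fake_emitter,
))
print(body["model"])      # -> gpt-5.2
print(events[0]["type"])  # -> chat:model
```

With this PR, the rewritten body["model"] is what lands in the database, and the emitted chat:model event is what updates the chat UI.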

Contributor License Agreement

By submitting this pull request, I confirm that I have read and fully agree to the Contributor License Agreement (CLA), and I am providing my contributions under its terms.

Note

Deleting the CLA section will lead to immediate closure of your PR and it will not be merged in.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-25 13:29:27 -05:00
Reference: github-starred/open-webui#41204