[GH-ISSUE #24095] Bug: Title/tag generation fails with TypeError in background_tasks_handler for direct connection chats #58850

Open
opened 2026-05-06 00:16:56 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @champumbc on GitHub (Apr 24, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/24095

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker (ECS Fargate)

Open WebUI Version

v0.9.2

Ollama Version (if applicable)

N/A

Operating System

Linux (AWS Fargate ARM64)

Browser (if applicable)

All browsers

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation.

Related Issue

This is a companion bug to #24092. That issue covers the frontend-initiated HTTP endpoint path (POST /api/v1/tasks/title/completions). This issue covers the internal middleware path where background_tasks_handler triggers title/tag generation automatically after a chat completion.

Expected Behavior

After a chat completion finishes via a direct connection model, the middleware's background_tasks_handler should generate a title and tags using the configured Task Model (e.g., "Amazon Nova Micro" configured in Admin > Settings), making a server-side API call.

Actual Behavior

Title and tag generation crashes with TypeError: 'NoneType' object is not callable. The chat title falls back to displaying the user's first message instead of a generated title.

Full traceback from server logs:

middleware.py:5020 → await background_tasks_handler(ctx)
  middleware.py:3112 → res = await generate_title(request, {...}, user)
    tasks.py:220     → return await generate_direct_chat_completion(request, form_data, user=user, models=models)
      chat.py:140    → res = await event_caller({...})
TypeError: 'NoneType' object is not callable

Root Cause

There are three compounding issues:

1. request.state.direct persists into background task generation:

When background_tasks_handler (middleware.py:3001) calls generate_title(request, ...), it passes the original request object from the chat completion. This request still has request.state.direct = True set from the user's direct connection chat.

In generate_title (tasks.py:~160), this causes:

if getattr(request.state, 'direct', False) and hasattr(request.state, 'model'):
    models = {request.state.model['id']: request.state.model}  # Only the direct connection model

2. Task model override fails because models dict is too narrow:

get_task_model_id (utils/task.py) tries to find the configured Task Model ("Amazon Nova Micro") in the models dict, but models only contains the direct connection model (e.g., "Claude Haiku 4.5"). The override never happens:

def get_task_model_id(default_model_id, task_model, task_model_external, models):
    task_model_id = default_model_id
    if models.get(task_model_id, {}).get('connection_type') == 'local':
        if task_model and task_model in models:  # "Amazon Nova Micro" NOT in models
            task_model_id = task_model
    else:
        if task_model_external and task_model_external in models:  # Also NOT in models
            task_model_id = task_model_external
    return task_model_id  # Returns "Claude Haiku 4.5" unchanged
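The skipped override can be reproduced in isolation with a simplified copy of the function above (the model IDs and connection_type values here are illustrative, not taken from a real deployment):

```python
# Simplified copy of get_task_model_id from utils/task.py, as quoted above.
def get_task_model_id(default_model_id, task_model, task_model_external, models):
    task_model_id = default_model_id
    if models.get(task_model_id, {}).get("connection_type") == "local":
        if task_model and task_model in models:
            task_model_id = task_model
    else:
        if task_model_external and task_model_external in models:
            task_model_id = task_model_external
    return task_model_id

# Narrow dict built from request.state.model: only the direct-connection model.
narrow = {"claude-haiku-4.5": {"connection_type": "external"}}
# A fuller registry that also knows the configured task model.
full = dict(narrow, **{"amazon-nova-micro": {"connection_type": "external"}})

print(get_task_model_id("claude-haiku-4.5", "amazon-nova-micro",
                        "amazon-nova-micro", narrow))
# → claude-haiku-4.5  (override silently skipped: task model not in dict)
print(get_task_model_id("claude-haiku-4.5", "amazon-nova-micro",
                        "amazon-nova-micro", full))
# → amazon-nova-micro  (override works once the registry is complete)
```

This shows the failure is silent: the function returns the default model unchanged rather than raising, so the bug only surfaces later when the completion is routed down the direct path.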

3. Direct completion path crashes:

Since request.state.direct is True, generate_chat_completion (chat.py:196) routes to generate_direct_chat_completion. This function calls event_caller = await get_event_call(metadata) (chat.py:75), which returns None because the background task context doesn't have a valid WebSocket session. Then res = await event_caller({...}) at line 140 crashes.
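The final crash can be reproduced in miniature with a stand-in for the event-call helper (get_event_call's body and the metadata shape here are simplified assumptions, not the actual Open WebUI implementation; only the None-return-then-call pattern matches the traceback):

```python
import asyncio

async def get_event_call(metadata):
    # Stand-in: the real helper returns None when the metadata carries
    # no live WebSocket session, which is the case for background tasks.
    if not metadata.get("session_id"):
        return None
    async def _caller(event):
        return event
    return _caller

async def background_title_generation():
    # Background task context: no WebSocket session id available.
    event_caller = await get_event_call({})
    # event_caller is None, so this call raises exactly the error
    # seen in the traceback above.
    return await event_caller({"type": "request:chat:completion"})

try:
    asyncio.run(background_title_generation())
except TypeError as e:
    print(e)  # → 'NoneType' object is not callable
```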

Steps to Reproduce

  1. Deploy Open WebUI with ENABLE_DIRECT_CONNECTIONS=true and ENABLE_OLLAMA_API=False
  2. Configure a server-level OpenAI connection with at least one model (e.g., "Amazon Nova Micro")
  3. Set Task Model (Local) and Task Model (External) to "Amazon Nova Micro" in Admin > Settings
  4. As a user, add a direct connection (e.g., with access to "Claude Haiku 4.5")
  5. Start a new chat using the direct connection model
  6. Send a message — the chat works, but:
    • The auto-generated title falls back to the user's first message text
    • Tags are not generated
    • Server logs show the TypeError traceback

Suggested Fix

The background_tasks_handler should ensure that task generation uses the server-side path when the resolved task model is a server-level model:

Option A: Clear request.state.direct before calling task functions:

async def background_tasks_handler(ctx):
    request = ctx['request']
    # Task generation should use the server-side task model,
    # not the user's direct connection
    original_direct = getattr(request.state, 'direct', False)
    request.state.direct = False
    try:
        # ... existing title/tag/follow-up generation code ...
    finally:
        request.state.direct = original_direct

Option B: Have get_task_model_id fall back to request.app.state.MODELS when the configured task model isn't in the narrow direct-connection models dict.

Option C: Have generate_title/generate_chat_tags check whether the resolved task_model_id is a server-side model and explicitly use request.app.state.MODELS for the completion call in that case.

Option A is the simplest and most targeted fix.
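For comparison, Option B could be sketched roughly as follows. The extra server_models parameter is hypothetical, standing in for request.app.state.MODELS; the real call-site wiring would differ:

```python
# Hypothetical sketch of Option B: get_task_model_id also consults a
# server-side registry when the configured task model is missing from
# the narrow direct-connection models dict.
def get_task_model_id(default_model_id, task_model, task_model_external,
                      models, server_models=None):
    server_models = server_models or {}

    def known(model_id):
        # Accept a task model found in either registry.
        return model_id in models or model_id in server_models

    task_model_id = default_model_id
    if models.get(task_model_id, {}).get("connection_type") == "local":
        if task_model and known(task_model):
            task_model_id = task_model
    else:
        if task_model_external and known(task_model_external):
            task_model_id = task_model_external
    return task_model_id

narrow = {"claude-haiku-4.5": {"connection_type": "external"}}
server = {"amazon-nova-micro": {"connection_type": "external"}}
print(get_task_model_id("claude-haiku-4.5", None, "amazon-nova-micro",
                        narrow, server))
# → amazon-nova-micro  (fallback to the server registry succeeds)
```

Note that Option B alone would still route the completion down the direct path unless the caller also clears request.state.direct, which is why Option A remains the simpler fix.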

Environment

  • ENABLE_DIRECT_CONNECTIONS=true
  • ENABLE_OLLAMA_API=False
  • Server-level OpenAI connection configured
  • Task Model set to a server-level model in Admin Settings
  • Users rely on direct connections for chat
Author
Owner

@champumbc commented on GitHub (Apr 24, 2026):

In full disclosure, again, I did use Claude Code to help write and file the issue - but the problem is confirmed by a real human, me.

Author
Owner

@champumbc commented on GitHub (Apr 24, 2026):

PR with fix: https://github.com/open-webui/open-webui/pull/24099

Author
Owner

@champumbc commented on GitHub (Apr 24, 2026):

PR with fix: https://github.com/open-webui/open-webui/pull/24100 (replaces #24099 which had wrong base branch)

Reference: github-starred/open-webui#58850