[GH-ISSUE #11111] Chat render incorrectly injects attached filename into markdown when an array reference is in the text #31642

Closed
opened 2026-04-25 05:33:06 -05:00 by GiteaMirror · 1 comment

Originally created by @allegedalohomora on GitHub (Mar 3, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/11111

Originally assigned to: @tjbck on GitHub.

Bug Report

Installation Method

Installation via Docker (using these instructions: https://shawnhoover.dev/notes/home-ai-server.html)

Environment

  • Open WebUI Version: 0.5.18

  • Ollama (if applicable): N/A

  • Operating System: Windows 11 Pro

  • Browser (if applicable): Chrome - Version 133.0.6943.142 (Official Build) (64-bit)

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

Chat correctly renders the markdown supplied by the LLM.

Actual Behavior:

When the following conditions hold:

  1. A file has been uploaded into the context or made available via a "Workspace"
  2. LLM-supplied markdown contains an array reference - e.g. [0]

Then the rendering engine seems to inject the file reference in place of the array reference in the resulting HTML.

Bug Summary:
The rendering engine injects file-context data into the LLM response in some scenarios, depending on the structure of the markdown content.
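A minimal sketch of how this failure mode can arise (hypothetical code, not the actual Open WebUI implementation): a citation post-processor that rewrites every bracketed number in the model output into a `<source_id>` tag whenever files are attached, so array indexes like `Commands[0]` get mangled too.

```typescript
// Hypothetical sketch of the bug: rewrite EVERY "[n]" in the reply into a
// <source_id> citation tag when an nth attached file exists, with no check
// that "[n]" is actually a citation rather than an array index.
function naiveCitations(text: string, sources: string[]): string {
  return text.replace(/\[(\d+)\]/g, (match, n) => {
    const idx = Number(n);
    return idx < sources.length
      ? `<source_id data="${idx}" title="${sources[idx]}" />`
      : match; // no such source: leave the bracket untouched
  });
}

const reply = 'STATUS=$(aws ssm list-commands --query "Commands[0].Status")';
// With no attachments, the text passes through untouched:
console.log(naiveCitations(reply, []).includes("[0]")); // true
// With a file in context, "[0]" is swallowed by the citation rewrite:
console.log(naiveCitations(reply, ["report.pdf"]).includes("source_id")); // true
```

This matches the reported symptom: the same prompt renders correctly until a file enters the context, at which point `[0]` becomes a source tag.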

Reproduction Details

Steps to Reproduce:

  1. Start a new chat.
  2. Ask the LLM to do some work - e.g.
    Reformat this string: STATUS=$(aws ssm list-commands --command-id "$COMMAND_ID" --query "Commands[0].Status" --output text)
  3. Now add a file into the context (via upload or # syntax to add a workspace context)
  4. Repeat the ask to the LLM.
  5. Observe that the response has had the [0] in the string replaced with:
    <source_id data="0" title="...name of file in context referenced..." />
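One way to avoid the collision (an illustrative sketch under assumptions, not the actual fix): only treat `[n]` as a citation when it is not glued to an identifier (as in `Commands[0]`) and not inside an inline code span.

```typescript
// Hypothetical mitigation sketch: skip "[n]" when it directly follows a word
// character (an array index) or sits inside backtick-delimited inline code.
function guardedCitations(text: string, sources: string[]): string {
  return text
    .split(/(`[^`]*`)/) // capture group keeps inline code spans in the output
    .map((part) =>
      part.startsWith("`")
        ? part // inline code: never rewrite
        : part.replace(/(^|[^\w\]])\[(\d+)\]/g, (m, pre, n) => {
            const idx = Number(n);
            return idx < sources.length
              ? `${pre}<source_id data="${idx}" title="${sources[idx]}" />`
              : m;
          })
    )
    .join("");
}

const out = guardedCitations("Commands[0] — see [0]", ["report.pdf"]);
// "Commands[0]" survives; the standalone "[0]" becomes a citation tag.
```

Heuristics like this still miss cases (e.g. unfenced array literals at the start of a line), which is why anchoring citation markup to an unambiguous token is the more robust design.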

Logs and Screenshots

Browser Console Logs:
console-log-localhost-1740782433165.log (https://github.com/user-attachments/files/19061904/console-log-localhost-1740782433165.log)

Docker Container Logs:
owebui_docker_log.txt (https://github.com/user-attachments/files/19061908/owebui_docker_log.txt)

Screenshots/Screen Recordings (if applicable):
Image (https://github.com/user-attachments/assets/328d671e-614d-45bb-996f-50d5e268c62b)


@tjbck commented on GitHub (Mar 4, 2025):

Addressed in dev with d844fc7edb6314bf2422ea1b8bbd1c360f694f2e

Image (https://github.com/user-attachments/assets/93e6cb85-62b5-4864-9275-118c1874ac3b)

Reference: github-starred/open-webui#31642