[GH-ISSUE #15279] issue: LLM streaming responses containing standalone ``` (triple backticks) are being stripped by backend middleware #33050

Closed
opened 2026-04-25 06:54:42 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @ai-poet on GitHub (Jun 25, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/15279

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Git Clone

Open WebUI Version

v0.6.15

Ollama Version (if applicable)

No response

Operating System

Ubuntu 22.04

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

LLM responses containing standalone ``` (triple backticks) should pass through the Python backend unmodified and be displayed as-is.

Actual Behavior

Standalone ``` markers are stripped from the streaming response by the Python backend, so the content that reaches the frontend is missing its code fences.
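
For illustration only: the sketch below (Python, since the backend is Python) shows the kind of stream post-processing that would produce exactly this symptom. It is a hypothetical example, not Open WebUI's actual middleware; the function name filter_stream and the sample deltas are invented for the demonstration.

# Hypothetical illustration only -- NOT Open WebUI's actual code.
# A stream post-processor that buffers deltas into lines and silently drops
# any line beginning with a code fence would strip the ``` markers before
# they ever reach the frontend, matching the symptom described above.
def filter_stream(deltas):
    buffer = ""
    for delta in deltas:
        buffer += delta
        while "\n" in buffer:
            line, buffer = buffer.split("\n", 1)
            if line.strip().startswith("```"):
                continue  # the standalone fence is discarded here
            yield line + "\n"
    if buffer:
        yield buffer

# A fenced snippet split across streaming chunks, as a provider might send it:
deltas = ["Here is code:\n``", "`javascript\nconsole.log(1);\n``", "`\nDone.\n"]
print("".join(filter_stream(deltas)))
# Output: "Here is code:\nconsole.log(1);\nDone.\n" -- both fence lines are gone.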

Steps to Reproduce

1. Use Open WebUI with the Chutes.ai chat completion API (https://chutes.ai).
2. Select DeepSeek V3 and send a prompt about code that would cause the LLM to respond with a standalone ``` in a streaming chunk.
3. Check the browser console logs to confirm that the ``` markers are not displayed and that they were never sent from the Python backend (a direct-API comparison script is sketched below).
4. The missing fences cause code-block rendering issues on the frontend.
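
To make the comparison in step 3 concrete, the raw provider stream can be captured outside Open WebUI and checked for the fences. This is only a sketch under assumptions: the base URL and model id below are placeholders (not verified Chutes.ai values), and the requests library is assumed to be installed.

# Sketch for comparing the raw upstream stream against what Open WebUI emits.
# BASE_URL and MODEL are placeholders -- substitute the values from your
# Chutes.ai account; API_KEY is your own key.
import json
import requests

BASE_URL = "https://<your-chutes-endpoint>/v1"   # placeholder, not verified
API_KEY = "sk-..."                               # your Chutes.ai API key
MODEL = "deepseek-v3"                            # placeholder model id

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": MODEL,
        "stream": True,
        "messages": [
            {"role": "user", "content": "Show me a short JavaScript snippet."}
        ],
    },
    stream=True,
)

for raw in resp.iter_lines():
    if not raw or not raw.startswith(b"data: "):
        continue
    payload = raw[len(b"data: "):]
    if payload == b"[DONE]":
        break
    delta = json.loads(payload)["choices"][0]["delta"].get("content", "")
    # Print each raw chunk as-is; the upstream stream should contain the
    # ``` fences that are missing from the chat:completion payload below.
    print(repr(delta))

If the fences appear in this direct stream but not in the browser console's chat:completion events, the stripping happens in the backend rather than at the provider.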

Logs & Screenshots

{
  "chat_id": "5250f0a7-716a-4f23-a11d-b54d9c564d28",
  "message_id": "1c07ff87-6437-4e87-b2a4-d32846aff358",
  "data": {
    "type": "chat:completion",
    "data": {
      "done": true,
      "content": "Here's a simple JavaScript code snippet that generates a random number between a specified range and displays it:\n\njavascript\n// Function to generate a random number between min and max (inclusive)\nfunction getRandomNumber(min, max) {\n return Math.floor(Math.random() * (max - min + 1)) + min;\n}\n\n// Example usage\nconst min = 1;\nconst max = 100;\nconst randomNumber = getRandomNumber(min, max);\n\nconsole.log(`Random number between ${min} and ${max}: ${randomNumber}`);\n\n\nYou can run this code in a browser's console, a Node.js environment, or any JavaScript playground. Let me know if you'd like something more specific or complex!"
    }
  }
}
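
As a quick sanity check on the payload above: a response containing a fenced JavaScript block should include at least two ``` markers, but the logged content field contains none. A minimal illustration (the string is an abbreviated stand-in for the logged content):

# The "content" field above, abbreviated; note there is no ``` anywhere in it.
received = "Here's a simple JavaScript code snippet ...\n\njavascript\nfunction getRandomNumber(min, max) { ... }\n"
print(received.count("```"))        # -> 0, but a fenced block should contribute >= 2
print("```javascript" in received)  # -> False; only the bare word "javascript" remains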

Additional Information

No response

GiteaMirror added the bug label 2026-04-25 06:54:42 -05:00