issue: JSON.parse error in OpenWebUI #6337

Closed
opened 2025-11-11 16:51:52 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @ZahraDehghani99 on GitHub (Sep 7, 2025).

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

v0.6.26

Ollama Version (if applicable)

No response

Operating System

Debian 12

Browser (if applicable)

Firefox

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

The LLM response should be displayed correctly in the OpenWebUI chat interface when running behind a reverse proxy (Caddy in my case).

Actual Behavior

The LLM response is generated successfully on the backend (verified in logs), but the UI fails to render it and throws the following error:

JSON.parse: unexpected character at line 1 column 1 of the JSON data

Instead of showing the answer, the message bubble is empty with the error.
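For context (not from the original report): this Firefox error class is raised whenever JSON.parse is fed a body that is not JSON, e.g. an HTML error page or an empty/truncated response from a proxy. A minimal sketch with a hypothetical payload, including a diagnostic pattern that surfaces the offending body instead of swallowing it:

```javascript
// Sketch: JSON.parse throws this class of error when the body is not
// JSON -- e.g. an HTML error page returned by a reverse proxy.
const htmlBody = "<html><body>502 Bad Gateway</body></html>"; // hypothetical

try {
  JSON.parse(htmlBody);
} catch (err) {
  // Firefox reports: "JSON.parse: unexpected character at line 1 column 1 ..."
  console.log(err.name); // SyntaxError
}

// Diagnostic pattern: read the raw text first, then parse, so the
// non-JSON body shows up in the console for inspection.
async function fetchJson(url) {
  const res = await fetch(url);
  const text = await res.text();
  try {
    return JSON.parse(text);
  } catch (err) {
    console.error("Non-JSON response from", url, ":", text.slice(0, 200));
    throw err;
  }
}
```

Checking the Network tab for the raw body of the failing request would narrow down whether the proxy or the backend produced the non-JSON payload.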

Steps to Reproduce

  1. Run the Open WebUI backend together with a separate chatbot backend (a Flask service, per the Caddy config below).
  2. Configure reverse proxy with Caddy (using .local domain).
  3. Increase timeouts in Caddy (read_timeout, write_timeout, idle_timeout).
  4. Send a prompt to the LLM.
  5. Observe that the backend returns a response in logs, but the UI fails with JSON.parse error.

Logs & Screenshots

(Screenshot: chat interface showing the empty message bubble with the JSON.parse error.)

Open WebUI container logs:

open-webui | 2025-09-07 10:21:35.363 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /c/e24e41e6-575d-4d16-a96c-c207ded9e56c HTTP/1.1" 200
open-webui | 2025-09-07 10:21:35.442 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /static/loader.js HTTP/1.1" 200
open-webui | 2025-09-07 10:21:35.446 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /static/custom.css HTTP/1.1" 304
open-webui | 2025-09-07 10:21:35.495 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /static/splash.png HTTP/1.1" 304
open-webui | 2025-09-07 10:21:35.498 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /static/splash-dark.png HTTP/1.1" 304
open-webui | 2025-09-07 10:21:36.280 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/config HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.325 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/auths/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.350 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/config HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.503 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/changelog HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.505 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/archived?page=1&order_by=updated_at&direction=desc HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.521 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/channels/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.522 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/folders/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.523 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /static/favicon.png HTTP/1.1" 304
open-webui | 2025-09-07 10:21:36.539 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/users/user/settings HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.550 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.578 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/all/tags HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.590 | INFO | open_webui.routers.openai:get_all_models:406 - get_all_models()
open-webui | 2025-09-07 10:21:36.618 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/models HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.638 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/pinned HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.646 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/configs/banners HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.684 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/folders/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.685 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/tools/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.711 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.749 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/e24e41e6-575d-4d16-a96c-c207ded9e56c HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.763 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/folders/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.917 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/e24e41e6-575d-4d16-a96c-c207ded9e56c/tags HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.955 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/users/user/settings HTTP/1.1" 200
open-webui | 2025-09-07 10:21:36.977 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/tasks/chat/e24e41e6-575d-4d16-a96c-c207ded9e56c HTTP/1.1" 200
open-webui | 2025-09-07 10:21:37.038 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /ollama/api/version HTTP/1.1" 200
open-webui | 2025-09-07 10:21:37.104 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/auths/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:37.271 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/version/updates HTTP/1.1" 200
open-webui | 2025-09-07 10:21:38.678 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /ollama/api/version HTTP/1.1" 200
open-webui | 2025-09-07 10:21:38.751 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/users/user/settings HTTP/1.1" 200
open-webui | 2025-09-07 10:21:42.517 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/auths/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:42.532 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/auths/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:42.549 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "POST /api/v1/chats/new HTTP/1.1" 200
open-webui | 2025-09-07 10:21:42.620 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200
open-webui | 2025-09-07 10:21:42.650 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/folders/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:42.668 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "POST /api/v1/chats/c1372e8e-a705-4e7a-a1af-e3416dab3cd8 HTTP/1.1" 200
open-webui | 2025-09-07 10:21:42.700 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200
open-webui | 2025-09-07 10:21:42.739 | INFO | open_webui.routers.openai:get_all_models:406 - get_all_models()
open-webui | 2025-09-07 10:21:42.745 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/folders/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:43.556 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "POST /api/chat/completions HTTP/1.1" 200
open-webui | 2025-09-07 10:21:43.625 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200
open-webui | 2025-09-07 10:21:43.645 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/folders/ HTTP/1.1" 200
open-webui | 2025-09-07 10:21:44.155 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/79e1e69b-1482-4ea1-a269-4ebf988f8905 HTTP/1.1" 200
open-webui | 2025-09-07 10:21:44.192 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 172.25.7.18:0 - "GET /api/v1/chats/11773e7d-83ff-4297-b2c7-d34d86267c97 HTTP/1.1" 200

Additional Information

Here is my Caddy configuration, which worked a few hours ago:

openwebui.local {
    tls internal
    encode gzip

    # Proxy all OpenAI/chatbot API requests to your Flask backend
    handle_path /v1/* {
        reverse_proxy 127.0.0.1:8000 {
            transport http {
                read_timeout 300s
                write_timeout 300s
            }
        }
    }

    # All other paths go to OpenWebUI
    handle {
        reverse_proxy 127.0.0.1:8080 {
            transport http {
                read_timeout 300s
                write_timeout 300s
            }
        }
    }
}
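Not part of the original report, but a common angle to check when the backend completes yet the UI cannot parse the response: Caddy buffering streamed (SSE/chunked) chat responses. A hedged sketch of the usual adjustment for the Open WebUI route; `flush_interval -1` is a real Caddy 2 `reverse_proxy` sub-directive that disables response buffering, though whether it resolves this particular issue is unverified:

```caddyfile
# Hypothetical variant of the "handle" block above for streamed responses.
handle {
    reverse_proxy 127.0.0.1:8080 {
        # Disable response buffering so streamed chat chunks reach
        # the browser immediately instead of being held by the proxy.
        flush_interval -1
        transport http {
            read_timeout 300s
            write_timeout 300s
        }
    }
}
```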

GiteaMirror added the bug label 2025-11-11 16:51:52 -06:00

Reference: github-starred/open-webui#6337