[GH-ISSUE #4491] All responses are returning blank brackets #13633

Closed
opened 2026-04-19 20:18:17 -05:00 by GiteaMirror · 0 comments

Originally created by @hanjin66 on GitHub (Aug 9, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/4491

Bug Report

Installation Method

[Describe the method you used to install the project, e.g., git clone, Docker, pip, etc.]
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:latest
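
A quick way to confirm that the container can actually reach Ollama at the OLLAMA_BASE_URL above (a minimal diagnostic sketch; with --network=host the container shares the host's network stack, and the in-container check assumes curl is available inside the image):

curl http://127.0.0.1:11434/api/tags
# lists the models Ollama is serving, checked from the host

docker exec open-webui curl -s http://127.0.0.1:11434/api/tags
# the same check from inside the open-webui container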

Environment

WSL

  • Open WebUI Version:
    v0.3.12
  • Ollama (if applicable): [e.g., v0.2.0, v0.1.32-rc1]
    ollama version is 0.1.44
  • Operating System:
    PRETTY_NAME="Ubuntu 22.04.4 LTS"
    NAME="Ubuntu"
    VERSION_ID="22.04"
    VERSION="22.04.4 LTS (Jammy Jellyfish)"
    VERSION_CODENAME=jammy
    ID=ubuntu
    ID_LIKE=debian
  • Browser (if applicable): [e.g., Chrome 100.0, Firefox 98.0]

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

When I say hello, it should give a proper response. The model itself works fine: I can run the Ollama model directly and get a proper response, but when I use Open WebUI it gives me back blank brackets.
[Describe what you expected to happen.]

Actual Behavior:

When I ask anything, it returns empty brackets.
[Describe what actually happened.]
It used to work just fine before the 8-7-2024 version update; it was working fine yesterday. Today I pulled the latest version and it stopped working: every response returns blank brackets.

Description

Bug Summary:
[Provide a brief but clear summary of the bug]
All responses are returning blank brackets; when I type "Hello", the response is a blank {}.

Reproduction Details

Steps to Reproduce:
[Outline the steps to reproduce the bug. Be as detailed as possible.]

Logs and Screenshots

To create a public link, set share=True in launch().
Startup time: 44.4s (prepare environment: 39.0s, import torch: 2.1s, import gradio: 0.5s, setup paths: 0.6s, initialize shared: 0.2s, other imports: 0.3s, list SD models: 0.1s, load scripts: 0.6s, create ui: 0.5s, gradio launch: 0.1s, add APIs: 0.2s).
Creating model from config: /home/mylinux/stablediff/stable-diffusion-webui/configs/v1-inference.yaml
/home/mylinux/stablediff/stable-diffusion-webui/venv/lib/python3.10/site-packages/huggingface_hub/file_download.py:1150: FutureWarning: resume_download is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use force_download=True.
warnings.warn(
Applying attention optimization: Doggettx... done.
Model loaded in 14.5s (load weights from disk: 1.0s, create model: 0.4s, apply weights to model: 12.6s, apply dtype to VAE: 0.1s, calculate empty prompt: 0.3s).
Browser Console Logs:
[Include relevant browser console logs, if applicable]
controller AbortController {signal: AbortSignal {aborted: false, onabort: null, reason: undefined}}
Chat.svelte:847 {"model":"llama3.1:latest","created_at":"2024-08-09T05:41:05.205502016Z","message":{"role":"assistant","content":"{}"},"done":false}
Chat.svelte:847 {"model":"llama3.1:latest","created_at":"2024-08-09T05:41:05.246354509Z","message":{"role":"assistant","content":""},"done_reason":"stop","done":true,"total_duration":514822104,"load_duration":93065336,"prompt_eval_count":10,"prompt_eval_duration":309113000,"eval_count":2,"eval_duration":41858000}
+layout.svelte:130 usage {models: ['llama3.1:latest']}
+layout.svelte:130 usage {models: ['llama3.1:latest']}
+layout.svelte:130 usage {models: ['llama3.1:latest']}
+layout.svelte:130 usage {models: ['llama3.1:latest']}
+layout.svelte:130 usage {models: ['llama3.1:latest']}
+layout.svelte:130 usage {models: []}
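
The two Chat.svelte lines above show the assistant content arriving from the Ollama chat endpoint as "{}" followed by an empty final chunk. A minimal sketch for comparing against a direct call to Ollama's /api/chat (same model name as in the logs, request shape per the Ollama API), to see whether the model itself returns "{}" for a plain "Hello" or whether only the request routed through Open WebUI does:

curl http://127.0.0.1:11434/api/chat -d '{
  "model": "llama3.1:latest",
  "messages": [{"role": "user", "content": "Hello"}],
  "stream": false
}'
# a normal greeting in message.content here, versus "{}" through the WebUI,
# would point at the request built by Open WebUI rather than at the model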
Docker Container Logs:
[Include relevant Docker container logs, if applicable]
glibc version is 2.35
Check TCMalloc: libtcmalloc_minimal.so.4
libtcmalloc_minimal.so.4 is linked with libc.so,execute LD_PRELOAD=/lib/x86_64-linux-gnu/libtcmalloc_minimal.so.4
Python 3.10.6 (main, Oct 24 2022, 16:07:47) [GCC 11.2.0]
Version: v1.10.1
Commit hash: 82a973c04367123ae98bd9abdf80d9eda9b910e2
Installing clip
Installing open_clip
Installing requirements
Launching Web UI with arguments: --listen --api
no module 'xformers'. Processing without...
no module 'xformers'. Processing without...
No module 'xformers'. Proceeding without it.
Loading weights [6ce0161689] from /home/mylinux/stablediff/stable-diffusion-webui/models/Stable-diffusion/v1-5-pruned-emaonly.safetensors
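
The startup log above appears to come from the Stable Diffusion WebUI rather than from the Open WebUI container itself; the container's own logs (container name open-webui, per the docker run command in Installation Method) can be captured with standard Docker commands, for example:

docker logs --tail 200 open-webui
# last 200 lines from the Open WebUI container

docker logs -f open-webui
# follow the logs while reproducing the blank-bracket response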
Screenshots/Screen Recordings (if applicable):
[Attach any relevant screenshots to help illustrate the issue]
image: https://github.com/user-attachments/assets/310395b4-c50f-4cef-a59c-ee1447bb8385

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
