mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-07 03:18:23 -05:00
[GH-ISSUE #4491] All responses are returning blank brackets #52299
Originally created by @hanjin66 on GitHub (Aug 9, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/4491
Bug Report
Installation Method
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:latest
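For reference, the `docker run` command above can also be written as a compose file. This is a sketch mirroring the flags as given (the service name and top-level volume declaration are assumptions, not taken from the report):

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:latest
    container_name: open-webui
    network_mode: host                        # same as --network=host
    restart: always                           # same as --restart always
    environment:
      - OLLAMA_BASE_URL=http://127.0.0.1:11434
    volumes:
      - open-webui:/app/backend/data

volumes:
  open-webui:
```

Since the report says the previous day's image worked, pinning a known-good release tag instead of `:latest` is one way to roll back while the regression is investigated.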
Environment
WSL
v0.3.12
ollama version is 0.1.44
PRETTY_NAME="Ubuntu 22.04.4 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04.4 LTS (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
Confirmation:
Expected Behavior:
When I say hello, it should give a proper response. The model itself works fine: I can run the Ollama model directly and get a proper response, but when I use Open WebUI it gives me back blank brackets.
Actual Behavior:
When I ask anything, it returns empty brackets.
This used to work just fine before the 8-7-2024 version update; it was working fine yesterday. Today I pulled the latest version and it stopped working: everything returns blank brackets.
Description
Bug Summary:
All responses are returning blank brackets. When I type "Hello", the response is a blank {}.
Reproduction Details
Steps to Reproduce:
[Outline the steps to reproduce the bug. Be as detailed as possible.]
Logs and Screenshots
To create a public link, set `share=True` in `launch()`.
Startup time: 44.4s (prepare environment: 39.0s, import torch: 2.1s, import gradio: 0.5s, setup paths: 0.6s, initialize shared: 0.2s, other imports: 0.3s, list SD models: 0.1s, load scripts: 0.6s, create ui: 0.5s, gradio launch: 0.1s, add APIs: 0.2s).
Creating model from config: /home/mylinux/stablediff/stable-diffusion-webui/configs/v1-inference.yaml
/home/mylinux/stablediff/stable-diffusion-webui/venv/lib/python3.10/site-packages/huggingface_hub/file_download.py:1150: FutureWarning:
`resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
Applying attention optimization: Doggettx... done.
Model loaded in 14.5s (load weights from disk: 1.0s, create model: 0.4s, apply weights to model: 12.6s, apply dtype to VAE: 0.1s, calculate empty prompt: 0.3s).
Browser Console Logs:
controller AbortController {signal: AbortSignal {aborted: false, onabort: null, reason: undefined}}
Chat.svelte:847 {"model":"llama3.1:latest","created_at":"2024-08-09T05:41:05.205502016Z","message":{"role":"assistant","content":"{}"},"done":false}
Chat.svelte:847 {"model":"llama3.1:latest","created_at":"2024-08-09T05:41:05.246354509Z","message":{"role":"assistant","content":""},"done_reason":"stop","done":true,"total_duration":514822104,"load_duration":93065336,"prompt_eval_count":10,"prompt_eval_duration":309113000,"eval_count":2,"eval_duration":41858000}
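The two streamed chunks logged above can be decoded to confirm what the client actually received. Ollama's `/api/chat` streams newline-delimited JSON, and a streaming client concatenates each chunk's `message.content` until a chunk arrives with `"done": true`. This minimal sketch (plain Python; the chunks are trimmed to the fields relevant here) reconstructs the reply:

```python
import json

# Streamed chunks copied from the browser console (Chat.svelte:847),
# trimmed to the fields a streaming client actually reads.
stream = [
    '{"model":"llama3.1:latest","message":{"role":"assistant","content":"{}"},"done":false}',
    '{"model":"llama3.1:latest","message":{"role":"assistant","content":""},"done_reason":"stop","done":true}',
]

# Concatenate message.content across chunks, as a streaming client would.
reply = "".join(json.loads(line)["message"]["content"] for line in stream)
print(repr(reply))  # -> '{}'
```

So the model's entire reply really is a literal `{}` — the blank brackets are coming back from Ollama itself, not being dropped by the UI. One possible cause (an assumption, not confirmed by this report) is the chat request being sent with Ollama's JSON output format enabled, which constrains a plain "Hello" prompt to produce an empty JSON object.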
+layout.svelte:130 usage {models: ['llama3.1:latest']}
+layout.svelte:130 usage {models: ['llama3.1:latest']}
+layout.svelte:130 usage {models: ['llama3.1:latest']}
+layout.svelte:130 usage {models: ['llama3.1:latest']}
+layout.svelte:130 usage {models: ['llama3.1:latest']}
+layout.svelte:130 usage {models: Array(0)}
Docker Container Logs:
glibc version is 2.35
Check TCMalloc: libtcmalloc_minimal.so.4
libtcmalloc_minimal.so.4 is linked with libc.so, execute LD_PRELOAD=/lib/x86_64-linux-gnu/libtcmalloc_minimal.so.4
Python 3.10.6 (main, Oct 24 2022, 16:07:47) [GCC 11.2.0]
Version: v1.10.1
Commit hash: 82a973c04367123ae98bd9abdf80d9eda9b910e2
Installing clip
Installing open_clip
Installing requirements
Launching Web UI with arguments: --listen --api
no module 'xformers'. Processing without...
no module 'xformers'. Processing without...
No module 'xformers'. Proceeding without it.
Loading weights [6ce0161689] from /home/mylinux/stablediff/stable-diffusion-webui/models/Stable-diffusion/v1-5-pruned-emaonly.safetensors
Screenshots/Screen Recordings (if applicable):
[Attach any relevant screenshots to help illustrate the issue]
Additional Information
[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!