Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 03:18:23 -05:00)
issue: Chat no longer works after an update #4497
Originally created by @mistrjirka on GitHub (Mar 19, 2025).
Check Existing Issues
Installation Method
Git Clone
Open WebUI Version
v0.5.20
Ollama Version (if applicable)
ollama version is 0.6.1
Operating System
Arch Linux
Browser (if applicable)
Firefox
Confirmation
Expected Behavior
When I send a message to the AI, I expect it to respond.
Actual Behavior
The AI never responds; it just acts as if it were loading.
Steps to Reproduce
Update to the latest versions of Open WebUI and Ollama, then try to use Open WebUI.
Logs & Screenshots
I looked through the logs and cannot find anything that suggests something is broken. The website waits indefinitely on the POST request to "new".
This is the request that loads indefinitely, as cURL:
curl 'http://10.154.224.4:8080/api/v1/chats/new' -X POST \
  -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:136.0) Gecko/20100101 Firefox/136.0' \
  -H 'Accept: application/json' \
  -H 'Accept-Language: cs,sk;q=0.8,en-US;q=0.5,en;q=0.3' \
  -H 'Accept-Encoding: gzip, deflate' \
  -H 'Referer: http://10.154.224.4:8080/' \
  -H 'Content-Type: application/json' \
  -H 'authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpZCI6IjBmOWMyZWQ0LTBiYjEtNDU4Ny04NjJiLWEzZjNmZDg1NDE1NyJ9.fcNihgrjUO8QZhHYPAvv9VJ0OlM5Csf95BGvpoc83Uc' \
  -H 'Origin: http://10.154.224.4:8080' \
  -H 'Connection: keep-alive' \
  -H 'Cookie: token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpZCI6IjBmOWMyZWQ0LTBiYjEtNDU4Ny04NjJiLWEzZjNmZDg1NDE1NyJ9.fcNihgrjUO8QZhHYPAvv9VJ0OlM5Csf95BGvpoc83Uc' \
  -H 'Priority: u=0' -H 'Pragma: no-cache' -H 'Cache-Control: no-cache' \
  --data-raw $'{"chat":{"id":"","title":"Nov\xfd chat","models":["deepseek-r1:1.5b"],"params":{},"history":{"messages":{"7d5009b8-b990-41d0-a75e-2f406db93e8d":{"id":"7d5009b8-b990-41d0-a75e-2f406db93e8d","parentId":null,"childrenIds":[],"role":"user","content":"Jak se vede. Um\xed\u0161 \u010desky?","timestamp":1742412270,"models":["deepseek-r1:1.5b"]}},"currentId":"7d5009b8-b990-41d0-a75e-2f406db93e8d"},"messages":[{"id":"7d5009b8-b990-41d0-a75e-2f406db93e8d","parentId":null,"childrenIds":[],"role":"user","content":"Jak se vede. Um\xed\u0161 \u010desky?","timestamp":1742412270,"models":["deepseek-r1:1.5b"]}],"tags":[],"timestamp":1742412270120}}'

Attachments:
open-webui.log
ollama.log
console-export-2025-3-19_20-41-48.txt
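For anyone trying to reproduce this outside the browser, the hanging request can be replayed from a short script. This is only a minimal sketch using the Python standard library: the base URL, token, and model name are placeholders from my setup, and the payload shape mirrors the cURL body above.

```python
import json
import time
import uuid
import urllib.request


def build_new_chat_payload(model, content):
    """Build a JSON body shaped like the one the UI sends to /api/v1/chats/new."""
    msg_id = str(uuid.uuid4())
    now = int(time.time())
    message = {
        "id": msg_id, "parentId": None, "childrenIds": [],
        "role": "user", "content": content,
        "timestamp": now, "models": [model],
    }
    return {
        "chat": {
            "id": "", "title": "New chat", "models": [model], "params": {},
            "history": {"messages": {msg_id: message}, "currentId": msg_id},
            "messages": [message], "tags": [],
            "timestamp": now * 1000,
        }
    }


def post_new_chat(base_url, token, payload, timeout=10):
    """POST the payload; the timeout makes it fail fast instead of hanging forever."""
    req = urllib.request.Request(
        base_url.rstrip("/") + "/api/v1/chats/new",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer " + token},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)
```

With a working server this would be called as, e.g., post_new_chat("http://10.154.224.4:8080", "<token>", build_new_chat_payload("deepseek-r1:1.5b", "hello")); in the broken state it should raise a timeout error after 10 seconds instead of loading forever.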
Additional Information
I tested that Ollama works when I run it directly with ollama run.
I am currently tunneled to the machine over a WireGuard VPN, and I was not sure whether the API calls should use the VPN IP or the machine's local IP. I tried both, and in both cases the connection to Ollama is reported as working (when I click Verify).
I am running Ollama locally and Open WebUI in Docker. The system worked before updating Ollama and Open WebUI to the latest versions.
What is a bit disturbing is that I do not see any errors in the logs.
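Since Open WebUI runs in Docker while Ollama runs on the host, one thing worth ruling out after the update is whether the container can still reach Ollama at all. A hedged sketch of such a check: the /api/version endpoint is Ollama's standard version endpoint, but the host.docker.internal address mentioned below is an assumption about the Docker setup, not something confirmed here.

```python
import json
import urllib.error
import urllib.request


def check_ollama(base_url, timeout=5):
    """Return Ollama's reported version string, or None if it is unreachable.

    Uses GET /api/version with a short timeout so a broken connection
    shows up immediately instead of hanging.
    """
    try:
        with urllib.request.urlopen(base_url.rstrip("/") + "/api/version",
                                    timeout=timeout) as resp:
            return json.load(resp).get("version")
    except (urllib.error.URLError, OSError):
        return None
```

From inside the container one would call, e.g., check_ollama("http://host.docker.internal:11434") — on Linux, reaching the host this way may additionally require starting the container with --add-host=host.docker.internal:host-gateway.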