Mirror of https://github.com/open-webui/open-webui.git, synced 2026-03-12 10:04:14 -05:00
bug: Uncaught (in promise) TypeErrors in Open WebUI, causes page freeze & browser console errors until refresh #899
Originally created by @silentoplayz on GitHub (May 12, 2024).
Bug Report
Description
Bug Summary:
Uncaught (in promise) TypeErrors in Open WebUI, causing unexpected page freeze-like behavior, along with a few console log errors produced that may help debug the issues.
Expected Behavior:
No Uncaught (in promise) TypeErrors requiring a manual refresh by the user to unfreeze the page and clear out browser console log errors.
Actual Behavior:
Uncaught (in promise) TypeErrors, followed by a series of error messages related to Immutable, requiring a manual refresh by the user to unfreeze the page.
Environment
Confirmation:
Logs and Screenshots
Browser Console Logs + Reproduction Details:
1st error (browser console log):
Steps to Reproduce 1st Error:
Press F12 on your keyboard to open up your browser console and switch to the Console tab to observe the error.
Click the Edit button of the first message you sent in the chat (before the LLM's response in this chat) and observe a new error.
Click the Edit button of the LLM's response to your message and observe another new error.
2nd error (browser console log); not sure if this is a huge issue at all though, as it doesn't appear to cause any problems and is unrelated to the actual bug report I'm making here:
This error occurred upon signing out of my Open WebUI account:
This error occurred upon signing into my Open WebUI account:
Docker Container Logs:
Installation Method
Docker Desktop via a custom-built docker-compose.yml file for my Open WebUI instance and domains. Ollama is running natively on Windows via the Windows (Preview) version.
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
Please let me know if I've missed anything or if there's any additional information needed!
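For background, "Uncaught (in promise)" means an async operation rejected and nothing handled the rejection. A minimal sketch of how such a TypeError can arise and how catching it keeps the page responsive — the names (`renderMessage`, `History`) are hypothetical, not Open WebUI's actual code:

```typescript
// Hypothetical sketch: history.messages[id] can become undefined after a
// delete/edit race, so reading .content throws a TypeError inside the
// async function, which surfaces as "Uncaught (in promise)" if unhandled.
interface Message { content: string; }
interface History { messages: Record<string, Message>; }

async function renderMessage(history: History, id: string): Promise<string> {
  return history.messages[id].content.toUpperCase();
}

function renderMessageSafe(history: History, id: string): Promise<string> {
  // Attaching .catch() converts the rejection into a fallback value
  // instead of an uncaught error that leaves the UI in a broken state.
  return renderMessage(history, id).catch((err: Error) => {
    console.error("render failed:", err.message);
    return "";
  });
}
```

With the `.catch()` in place the error still appears in the console, but the promise resolves and the rest of the page keeps working instead of freezing.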
@tjbck commented on GitHub (May 13, 2024):
1st issue should be fixed on dev!
@Yanyutin753 commented on GitHub (May 22, 2024):
Every time I delete a message, the page throws an error and cannot be used normally. Only after refreshing the page does it work normally again.
Console error
@kojdj0811 commented on GitHub (Jun 9, 2024):
I noticed a similar error. I'm using a 70B model on a slow GPU (P40); I get an error at exactly 5 minutes and the text stops updating.
Even after this happened, the AI was still generating the answer on the server. Once generation finished, I confirmed that VRAM was cleared after an additional 5 minutes, per the default OLLAMA_KEEP_ALIVE setting.
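The 5-minute VRAM unload described above matches Ollama's default OLLAMA_KEEP_ALIVE of 5m. If the unload itself is the concern (separate from the UI freeze), the keep-alive window can be raised via an environment variable; a config sketch for a Linux/macOS shell (the 30m value is illustrative — on native Windows, set it as a system environment variable instead):

```shell
# Keep models loaded in VRAM for 30 minutes after the last request
# instead of Ollama's 5-minute default; "-1" keeps them loaded indefinitely.
export OLLAMA_KEEP_ALIVE=30m
```

Note this only changes how long Ollama keeps the model resident; it does not address the text-stops-updating error in the UI.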
@silentoplayz commented on GitHub (Jun 11, 2024):
I'm glad I could help by reporting bugs I've found that you then have fixed to make Open WebUI better for everyone to enjoy. Thanks for your hard work Tim. 🍻
@Algorithm5838 commented on GitHub (Jun 17, 2024):
I'm seeing the same bug and getting this error in the console when I delete the second message:
@tjbck commented on GitHub (Jun 17, 2024):
The second issue should also be fixed on latest dev!
@Algorithm5838 commented on GitHub (Jun 17, 2024):
Thanks! I can confirm that the issue has been fixed in the new update, v0.3.5.
@silentoplayz commented on GitHub (Jun 17, 2024):
Hype! You've fixed the bug I've had for so long. It appears that all the bugs I've reported over the lifetime of this bug report have now been fixed. Thanks for all your hard work @tjbck! ❤️
I believe this bug report can be closed as completed now.
@damajor commented on GitHub (Jul 21, 2024):
I have a similar issue on my instance.
I think it can be easily reproduced following these steps:
Well, I am not able to reproduce on freshly created chats again; maybe it comes from old chats I haven't used for a while (older than one day). I will update tomorrow or delete my comment if there are no more issues.
@silentoplayz commented on GitHub (Jul 21, 2024):
@damajor I am unable to reproduce this particular Uncaught (in promise) TypeError: Cannot read properties of undefined (reading 'model') error on the latest dev branch of Open WebUI, but I am able to reproduce a couple of new, different errors that I have not come across before.
Steps to reproduce the new errors I've found in the browser console:
The page will freeze beyond this point and requires a page refresh to fix. After refreshing the page, you can click the initial chat that you cloned from and get the same error in the browser console.
Note: Reproducing the initial error discussed in this comment not only causes the page to freeze, but also requires a page refresh to resolve the issue. These errors are reproducible with both local (Ollama) and external connection models.
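The "Cannot read properties of undefined (reading 'model')" message implies code dereferencing a chat entry that no longer exists, e.g. after cloning or deleting a chat. A defensive sketch of the failing pattern and a guard — the shapes and names are illustrative, not the project's actual code:

```typescript
// Hypothetical shapes; Open WebUI's real state is more complex.
interface ChatMessage { model?: string; }
interface Chat { messages: Record<string, ChatMessage>; }

// Failing pattern: throws a TypeError when the entry was removed.
function getModelUnsafe(chat: Chat, id: string): string | undefined {
  return chat.messages[id].model;
}

// Optional chaining yields undefined instead of throwing, and the
// nullish fallback keeps downstream rendering code working.
function getModelSafe(chat: Chat, id: string): string {
  return chat.messages[id]?.model ?? "unknown";
}
```

Guards like this don't fix the underlying state desync, but they stop a single missing entry from freezing the whole page.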
@silentoplayz commented on GitHub (Jul 22, 2024):
@kojdj0811 A related merge that should fix this issue: https://github.com/open-webui/open-webui/pull/3107
@damajor commented on GitHub (Jul 22, 2024):
Well, maybe the UI was in a bad state when I got the issue, but I am not able to reproduce the problem at the moment.
@silentoplayz commented on GitHub (Aug 6, 2024):
Closing in favor of https://github.com/open-webui/open-webui/issues/4408
@lehhair commented on GitHub (Dec 23, 2024):
Now I still have this problem when I use /quick type prompt