Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-05 18:38:17 -05:00)
[GH-ISSUE #3790] losing connection during llm model loading or token generation #13385
Originally created by @linxOD on GitHub (Jul 11, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/3790
I loaded the latest Docker image, v0.3.8.
Using Open WebUI via localhost worked fine, but switching to a reverse proxy caused some issues. Some of them I could resolve by reading closed issues like:
and by updating the server config. I also raised the server timeout to 10 minutes.
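For context, here is a minimal sketch of the proxy settings that typically matter for long-running streaming responses, assuming an nginx reverse proxy. The upstream address and the 10-minute timeout values are placeholders matching my setup, not a verified fix:

```nginx
# Hypothetical nginx location block for proxying Open WebUI.
# Upstream address and timeout values are assumptions.
location / {
    proxy_pass http://127.0.0.1:3000;
    proxy_http_version 1.1;

    # Needed for the WebSocket connection Open WebUI uses.
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;

    # Do not buffer the token stream; pass chunks through as they arrive.
    proxy_buffering off;

    # Allow slow model loads and long generations before the proxy gives up.
    proxy_read_timeout 600s;
    proxy_send_timeout 600s;
}
```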
However, there is one remaining issue about which I could not find anything.
Especially (but not only) with larger models, I receive this error in the browser console:

<my-proxy>/ollama/api/chat net::ERR_NETWORK_CHANGED 200 (OK)

The error appears either while a new model is loading in the ollama container or during token generation, and the stream just breaks off.
I do not see any errors in the ollama Docker container logs, nor in the open-webui container logs.
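One way to narrow this down might be to bypass the reverse proxy and stream the same request directly against the container port. This is only a sketch: the host, port, and model name are placeholders and need to be adjusted to the actual deployment (the `/ollama/api/chat` path is the one shown in the console error above):

```shell
# Hypothetical check: request a streaming chat completion directly from the
# open-webui container, bypassing the reverse proxy. If the stream survives
# here but breaks when going through the proxy, the proxy is the likely culprit.
# Host, port, and model name are placeholders.
curl -N http://localhost:3000/ollama/api/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "stream": true, "messages": [{"role": "user", "content": "Hello"}]}'
```

`-N` disables curl's own output buffering, so tokens should appear as they are generated rather than in one burst at the end.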
Any ideas?