[GH-ISSUE #3790] losing connection during llm model loading or token generation #13385

Closed
opened 2026-04-19 20:08:49 -05:00 by GiteaMirror · 0 comments

Originally created by @linxOD on GitHub (Jul 11, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/3790

I loaded the latest Docker image, v0.3.8:

```bash
podman run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v /home/llm/ollama/data/lioe:/app/backend/data:z --name open-webui --restart always ghcr.io/open-webui/open-webui:latest
```

Using Open WebUI via localhost worked fine, but switching to a reverse proxy caused some issues. Some of them I could resolve by reading closed issues like:

* [Nginx Config](https://github.com/open-webui/open-webui/discussions/1235#discussioncomment-9341387)
* [Websocket](https://github.com/open-webui/open-webui/issues/3054#issuecomment-2163426012)

and updating the server config accordingly. I also increased the server timeout to 10 minutes.
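For context, here is a minimal nginx sketch along the lines of the linked discussions; the `server_name`, the `proxy_pass` target, and the 10-minute timeouts are placeholders, not a verified config:

```nginx
# Hypothetical reverse-proxy config for Open WebUI behind nginx.
# server_name and the proxy_pass target are placeholders.
server {
    listen 80;
    server_name webui.example.com;

    location / {
        proxy_pass http://127.0.0.1:3000;

        # WebSocket support (socket.io connections need the Upgrade headers)
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;

        # Do not buffer streamed responses, so tokens reach the browser as they arrive
        proxy_buffering off;

        # Allow long-running requests (model loading / generation): 10 minutes
        proxy_read_timeout 600s;
        proxy_send_timeout 600s;
    }
}
```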

However, there is one remaining issue I could not find anything about.
Especially with larger models (but not only), I receive this error in the browser console:

`<my-proxy>/ollama/api/chat net::ERR_NETWORK_CHANGED 200 (OK)`

The error appears either while a new model is loading in the Ollama container or during token generation, and the stream just breaks off.

I do not see any errors in the Ollama container logs, and none in the open-webui container logs either.
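To narrow it down, it may help to stream from Ollama directly, bypassing the proxy; a rough check, assuming Ollama listens on its default port 11434 and using a placeholder model name:

```bash
# Stream a chat completion straight from the Ollama API (no reverse proxy).
# -N disables curl's output buffering so tokens print as they arrive.
curl -N http://localhost:11434/api/chat \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "Say hello"}]}'
```

If the stream survives this way but breaks when routed through the proxy, the proxy path is the likely culprit. Note that `net::ERR_NETWORK_CHANGED` is also raised by the browser itself when the client's network interface changes mid-request, so VPNs or Wi-Fi roaming on the client side may be worth ruling out.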

Any ideas?
