Network Problem on chat #3407

Closed
opened 2025-11-11 15:31:05 -06:00 by GiteaMirror · 2 comments

Originally created by @Zorgonaute84 on GitHub (Jan 24, 2025).

Hello all,

For the last six months, everything worked well. I didn't use Ollama for two or three months; then, since my last update, I can't chat without getting a "Network Problem" error message.

I have two separate instances, one for Open WebUI and one for Ollama. Both run in Docker, started with these two commands:

`docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main`

`docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama`

Through Open WebUI, I can connect to the Ollama server, check the connection, and download models. Everything looks OK, but when I try to chat with the model, I get a "Network Problem" error with no further details.

![Image](https://github.com/user-attachments/assets/5501cbdc-3d22-4ff4-83c4-dea98a129db1)

Any idea?

Thanks :-)
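(Editor's note, not part of the original report: with the two `docker run` commands above, Open WebUI typically reaches Ollama through `host.docker.internal:11434`, configurable via the `OLLAMA_BASE_URL` environment variable. A minimal diagnostic sketch, assuming the container names used above:)

```shell
# Check that the Open WebUI container can actually reach the Ollama API;
# a working setup prints a JSON version string from Ollama.
docker exec open-webui curl -s http://host.docker.internal:11434/api/version

# If that fails, recreate the Open WebUI container pointing at Ollama explicitly:
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```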


@tjbck commented on GitHub (Jan 24, 2025):

#8074


@wyl2000 commented on GitHub (Jan 25, 2025):

In the latest version, accessing the UI at the default 0.0.0.0:8080 produces the "Network Problem" prompt; switching to 127.0.0.1:8080 makes chat work normally. I don't know why this is the case, but accessing it this way works for me. I'm on Windows 11, running the project in a conda virtual environment, and the system normally routes traffic through a v2ray automatic system proxy. After installation it worked without the proxy, but on the latest version 0.5.7 I have to use 127.0.0.1:8080 for it to work.
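(Editor's note, not part of the original comment: this workaround is consistent with how bind addresses work. `0.0.0.0` tells the server to listen on all interfaces; it is not a destination a browser should navigate to, and on some platforms connecting to it fails. A quick check, assuming the server is running locally on port 8080:)

```shell
# 0.0.0.0 is a listen address for the server, not a client destination;
# connect to a concrete address instead and expect an HTTP status line back.
curl -sI http://127.0.0.1:8080/ | head -n 1
```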


Reference: github-starred/open-webui#3407