[GH-ISSUE #8429] PLEASE HELP ! - 2 DIFFERENT WEB GUI'S #30651

Closed
opened 2026-04-25 04:54:35 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @MADPANDA3D on GitHub (Jan 9, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/8429

Discussed in https://github.com/open-webui/open-webui/discussions/8428

Originally posted by MADPANDA3D January 9, 2025
Hello, let me start by saying I am not a programmer or developer in any way, shape, or form; I am learning. I set up Ollama to run on my computer through Linux (WSL). I have Open WebUI running in a Docker container, with Watchtower running as well. While exploring options for accessing my home-based AI system remotely, I decided to use ngrok.

Initially, ngrok was running through WSL (I have the paid service with an ngrok domain). I was able to access my Open WebUI GUI through the domain; however, it brought up a new Open WebUI GUI, not the locally hosted one I already have. I know this for a fact for a few reasons:

1. The versions are different: my locally hosted instance, watched by Watchtower in Docker, is updated to the new V5.4.
2. The web GUI accessed via my link is stuck on Version 5.2, and it does not share state with the local 8080 instance: the chats are different and the settings are not the same.
3. The Open WebUI that I open on my domain is tracking the Open WebUI running on 127.0.0.1, which I thought was the same as localhost:8080, but localhost:8080 and 127.0.0.1 on my local computer bring up two different web GUIs.
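A quick way to confirm that two different backends are answering is to ask each endpoint for its version and compare. The sketch below is a minimal diagnostic, assuming the Open WebUI backend exposes `GET /api/config` with a `"version"` field (the exact endpoint and field name are assumptions; adjust to whatever your instance actually returns):

```python
# Minimal sketch: probe each endpoint and report what (if anything) answers.
# Assumption: Open WebUI serves GET /api/config returning JSON with "version".
import json
import urllib.request
import urllib.error


def fetch_config(host: str, port: int, timeout: float = 3.0):
    """Return the parsed /api/config JSON, or None if nothing answers."""
    url = f"http://{host}:{port}/api/config"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return json.loads(resp.read().decode())
    except (urllib.error.URLError, OSError, ValueError):
        return None


if __name__ == "__main__":
    for host in ("localhost", "127.0.0.1"):
        cfg = fetch_config(host, 8080)
        version = cfg.get("version", "unknown") if cfg else "no response"
        print(f"{host}:8080 -> {version}")
```

If the two hosts report different versions (or one reports nothing), that confirms two separate server processes, not one GUI behaving oddly.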

I have been trying to use a mix of YouTube and ChatGPT to get this fixed and stabilized so that I can start the fine-tuning process on a model and implement it; I am just trying to finish the simple framework first.

![image](https://github.com/user-attachments/assets/0293dbf9-05a7-40b2-b9db-294fa3a3f449)
![image](https://github.com/user-attachments/assets/e2be063f-7075-49bd-83d1-e8bd0374485f)
![image](https://github.com/user-attachments/assets/f57ab986-808a-4c20-906e-d3ea8ff9f64a)

Now I have confirmed that the 127.0.0.1 version of the GUI is the one linked to the ngrok tunnel, even though the tunnel is set to 8080. Initially I ran ngrok straight on WSL, but then I decided to make an ngrok container, to try to get the Open WebUI container and the ngrok container on the same network, in hopes that this would fix the problem.
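One plausible explanation for the behavior described above: inside the ngrok container, `127.0.0.1` refers to the ngrok container itself, not to the host or to the Open WebUI container, so `ngrok http 8080` tunnels to the wrong place. A hypothetical `docker-compose.yml` sketch of the "same network" setup (service names, image tags, and the ngrok domain are placeholders):

```yaml
# Hypothetical sketch, not a verified config. The key point: the ngrok
# service targets Open WebUI by its service name ("open-webui"), never
# 127.0.0.1, because loopback inside a container is that container only.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "8080:8080"
    networks:
      - webui-net

  ngrok:
    image: ngrok/ngrok:latest
    command: http open-webui:8080 --domain=example.ngrok.app
    environment:
      - NGROK_AUTHTOKEN=${NGROK_AUTHTOKEN}
    networks:
      - webui-net

networks:
  webui-net:
```

On a shared user-defined network, Docker's built-in DNS resolves `open-webui` to that container's address, so the tunnel reaches the same instance you see on `localhost:8080`.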

IT DID NOT

Instead, now when I run ngrok it runs through Docker and still connects to the 127 version. Now you might say, "your open-webui container must be activating the wrong one." I hear you, so I stopped the Open WebUI container: my localhost:8080 web GUI stops working, but my 127 one is still running, and it seems to always be running; I am not sure how, where, or why.
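A server that keeps answering after the Docker container is stopped suggests a second, separately installed instance (for example a pip install inside WSL, or a process on the Windows side, since WSL 2 forwards localhost between the two). A rough diagnostic sketch for finding what is actually bound to port 8080 (commands and output are illustrative, not guaranteed for your setup):

```shell
# Inside WSL: is any process listening on 8080 there?
ss -tlnp 2>/dev/null | grep ':8080' || echo 'no WSL listener on 8080'

# Which Docker containers publish port 8080? (run manually)
# docker ps --format '{{.Names}}\t{{.Ports}}' | grep 8080

# On the Windows side (PowerShell), check for a separate Windows process:
# netstat -ano | findstr :8080
```

Whichever process shows up after the container is stopped is the "always running" 127 instance; killing or uninstalling it should leave only the Docker-hosted GUI.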

I have been dealing with this issue alone for 4 days, I have hit a brick wall, and ChatGPT is sending me in circles. I need help: any suggestions, ideas, comments, questions? If you need more info to help diagnose the problem, let me know. I may just be a dummy and it may be something simple.


Reference: github-starred/open-webui#30651