mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-06 19:08:59 -05:00
[GH-ISSUE #1] docker network access error #11897
Originally created by @slychief on GitHub (Oct 17, 2023).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1
Originally assigned to: @tjbck on GitHub.
Hi,
when running the docker build command via the run.sh script, I get the following error messages:
When accessing the web GUI I get a "500 Internal Server Error".
Any advice on how to fix this?
Best regards,
Alexander
@tjbck commented on GitHub (Oct 18, 2023):
Hi Alexander,
Could you verify if the Ollama server is running on http://127.0.0.1:11434/?
Thanks.
@slychief commented on GitHub (Oct 18, 2023):
Yes. curl http://127.0.0.1:11434 answers "Ollama is running".
@tjbck commented on GitHub (Oct 18, 2023):
Hi there,
Are you using Windows by any chance? If so, please try adding the "--add-host=host.docker.internal:host-gateway" to the docker command. I've recently updated the run.sh file, so kindly pull the latest code and test it out. Let me know if it resolves the issue.
Alternatively, if the aforementioned solution doesn't work, you can try running Ollama with the following configuration:
Keep me posted on your progress.
Thanks!
@slychief commented on GitHub (Oct 18, 2023):
Hi,
the docker "--add-host=..." param seems to have solved the 500 error. I can also confirm that the 500 error re-appears when Ollama is not running.
Running Ollama with the OLLAMA_HOST environment variable set as recommended also made everything work.
The only issue I'm still having is that it does not work with the Ollama docker image. I'm running the container using the following command:
docker run -d --gpus device=5 -v ollama:/root/.ollama -p 0.0.0.0:11434:11434 --name ollama ollama/ollama
The GUI is running, and I can select models. But when I submit a prompt, the GUI shows that it is being processed, and then nothing else happens. It just hangs indefinitely.
Strictly speaking, I don't rely on Docker. But I need to restrict Ollama to a specific GPU; otherwise it just grabs all available GPUs in the server. Do you know how to configure Ollama to use a specific GPU?
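(For reference: the standard way to restrict a CUDA application such as Ollama to one GPU outside Docker is the CUDA_VISIBLE_DEVICES environment variable. A sketch, with the GPU index 5 taken from the docker command above:)

```shell
# Expose only GPU 5 to the process; all other GPUs become invisible to CUDA
CUDA_VISIBLE_DEVICES=5 ollama serve
```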
@deputynl commented on GitHub (Oct 18, 2023):
I'm also struggling. I run the container with the command below. Ollama is running on 10.1.1.8:11434 and I can use the REST API, but when using the WebUI, I get no answer to any request.
docker run -d -p 4444:3000 --add-host=host.docker.internal:10.1.1.8 --name ollama-webui --restart always ollama-webui
@tjbck commented on GitHub (Oct 18, 2023):
Hi,
I'm glad to hear that the issue has been resolved and that it's now working for you. Regarding the Ollama docker image, you could try adding --net=host to make the container appear as if it's running on the host itself. The command would look something like this:
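(The command itself was lost in this mirror. Combining --net=host with the docker run posted earlier in the thread, it presumably looked roughly like this — the flags other than --net=host are carried over from that earlier command:)

```shell
# --net=host shares the host's network namespace with the container,
# so the -p port mapping is no longer needed
docker run -d --gpus device=5 -v ollama:/root/.ollama --net=host --name ollama ollama/ollama
```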
I haven't personally tested the command, so I can't guarantee it will work. Please let me know if the above command works for you.
Feel free to reach out if you need further assistance.
@tjbck commented on GitHub (Oct 18, 2023):
Hi @deputynl,
Currently, the Ollama WebUI requires Ollama to be accessible from the same IP address as the Ollama WebUI. To address this issue, I'll be adding a feature that allows you to edit the Ollama address via an environment variable soon.
Thank you for your patience, and I'll keep you posted on the progress of this feature. Feel free to reach out if you have any further questions or concerns.
@slychief commented on GitHub (Oct 18, 2023):
Hi,
No, that hasn't solved it. I have posted the question of how to assign specific GPUs to ollama serve in the Ollama issue tracker. That would be a solution.
I will experiment with docker settings as well. If I find a solution, I will post it to this thread.
Thanks for the great support!
@tjbck commented on GitHub (Oct 18, 2023):
Hi @deputynl,
I have just implemented an environment variable that allows you to connect to the model when Ollama is hosted on a different server. You can utilize the environment variable -e OLLAMA_ENDPOINT="http://[insert your Ollama address]" to establish the connection.
For your specific use case, the following code should work:
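(The snippet is missing from this mirror. Given the -e OLLAMA_ENDPOINT syntax above and the docker run @deputynl posted earlier, it presumably resembled the following:)

```shell
# Point the WebUI container at the remote Ollama server
# via the newly added OLLAMA_ENDPOINT environment variable
docker run -d -p 4444:3000 -e OLLAMA_ENDPOINT="http://10.1.1.8:11434" --name ollama-webui --restart always ollama-webui
```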
Feel free to test it out, and please let me know if you encounter any issues or if you have any other questions.
@deputynl commented on GitHub (Oct 18, 2023):
Thanks for that very quick update. I have just tried it, but sadly the result is the same. It doesn't output any errors; is there anything I can switch on to get better feedback? Thanks again!
@tjbck commented on GitHub (Oct 18, 2023):
@deputynl
I'm sorry to hear that the environment variable didn't work as expected. Could you please provide the logs by using the following command:
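(The command did not survive this mirror; the standard way to fetch a container's output, using the container name from the docker run above, is:)

```shell
# Stream everything the ollama-webui container has written to stdout/stderr
docker logs ollama-webui
```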
This should provide us with all the logs to help us understand what might be causing the issue.
Additionally, were you able to run Ollama with the following command successfully?
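(This command is also absent from the mirror; it most likely refers back to the OLLAMA_HOST configuration recommended earlier in the thread — an assumption on my part:)

```shell
# Bind Ollama to all interfaces so other hosts/containers can reach it
OLLAMA_HOST=0.0.0.0 ollama serve
```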
Please let me know the outcome of these steps so I can assist you further.
@tjbck commented on GitHub (Oct 18, 2023):
@deputynl
I also just noticed that one of the files wasn't pushed to the repository. Could you please try the same command with the latest release of the repository by pulling the latest changes?
Let me know if this resolves the issue or if you need any further assistance.
Thanks.
@deputynl commented on GitHub (Oct 19, 2023):
I just tried with the latest release, but still get the same result. The logs have the following content:
Listening on 0.0.0.0:3000
http://10.1.1.8:11434
I'm running the ollama container as provided by that project, so I'm not sure how I could change the command. I did, however, run the commands below from inside the ollama-webui container, which I suspect shows that the Ollama REST interface is reachable from the ollama-webui container.
set
ENV='prod'
HOME='/root'
HOSTNAME='90eee2e42f0c'
IFS='
'
NODE_VERSION='20.8.1'
OLLAMA_ENDPOINT='http://10.1.1.8:11434'
OPTIND='1'
PATH='/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
PPID='0'
PS1='# '
PS2='> '
PS4='+ '
PWD='/app'
TERM='xterm'
YARN_VERSION='1.22.19'
curl http://10.1.1.8:11434
Ollama is running
@tjbck commented on GitHub (Oct 19, 2023):
@deputynl
Thank you for sharing the logs and the details of the commands you ran within the ollama-webui container. Since the Ollama server appears to be reachable from the container, the issue is likely related to CORS.
If you were able to access the main page, it indicates that CORS might indeed be causing the problem. To help us further diagnose the issue, could you please provide a screenshot of your console logs from the browser's developer tools?
Additionally, to enable CORS from the Ollama server, it is necessary to run the following command:
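(The command is absent from this mirror. Ollama controls allowed cross-origin requests through the OLLAMA_ORIGINS environment variable, so the recommendation was presumably close to the following — the wildcard value is an assumption:)

```shell
# Allow cross-origin requests from any origin and listen on all interfaces
OLLAMA_ORIGINS=* OLLAMA_HOST=0.0.0.0 ollama serve
```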
Kindly execute this command for the Ollama server. After making these changes, please attempt to access the Ollama WebUI again to check if the issue is resolved.
Thanks.