basic docker set up question #490

Closed
opened 2025-11-11 14:22:53 -06:00 by GiteaMirror · 6 comments

Originally created by @LaptopDev on GitHub (Mar 17, 2024).

I am accessing the webui remotely, running both ollama and open-webui on my remote server.
I want models to run on my powerful server, and want to interact with them through the webui.

For some reason, the instructions do not explicitly say which docker run command to use for this configuration. There are two docker run commands to choose from: one for connecting to a local Ollama server, which I chose, and another for connecting to a remote Ollama server.

I cannot connect to Ollama, and I cannot add models. On my server, ollama in the terminal lists 2 available models, but they do not show up in open-webui. Do I need to use the other docker run command?

@justinh-rahb commented on GitHub (Mar 17, 2024):

The method of communication between the two containers remains consistent whether they're hosted on a remote server or your local machine, provided they're deployed together. The only change involves how you interact with the WebUI; instead of using http://localhost:3000, you'd connect via the server's IP address or hostname. Given this setup, leveraging Docker Compose for installation is advisable. This approach not only simplifies managing the containers as a single unit but also makes full use of Docker's capabilities on a server environment.

https://docs.openwebui.com/getting-started/#using-docker-compose
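
For reference, here is a minimal sketch of what such a setup could look like when both containers run on the same server. The service names, volume names, and port mapping below are illustrative assumptions, not copied from the linked docs:

```bash
# Minimal sketch: write a docker-compose.yml with both services and start them.
# Inside the Compose network, open-webui can reach Ollama by its service name.
cat > docker-compose.yml <<'EOF'
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    depends_on:
      - ollama
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data
volumes:
  ollama:
  open-webui:
EOF
docker compose up -d
```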

@LaptopDev commented on GitHub (Mar 17, 2024):

> ... communication between the two containers ... provided they're deployed together. ... Given this setup, leveraging Docker Compose for installation is advisable. This approach ... simplifies managing the containers as a single unit but also makes full use of Docker's capabilities on a server environment.

It seems to me you are under the impression that I ran both in containers. But I technically did follow the instructions, since my binary execution of 'ollama serve' is "on my computer":

> If Ollama is on your computer, use this command:

```
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

But what I understand from your reply is that this means they are not "deployed together", and must be (making the quick-start documentation inapplicable for those running ollama as a binary). Do I have that right?

I am connecting via the server's IP address fine, and I assume I would do the same with a setup involving docker-compose.yml/compose.yml, per the documentation you linked. But it seems there is still some misunderstanding about my configuration and about what is causing it to fail to connect to ollama.

@justinh-rahb commented on GitHub (Mar 17, 2024):

Now I understand your deployment scenario better:

```bash
docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=http://your-computer-ip:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
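
A quick sanity check (not from the original thread) is to confirm the Ollama API is actually reachable at whatever address you substitute for your-computer-ip before blaming the WebUI; this assumes Ollama's default port 11434:

```bash
# /api/tags lists the models Ollama has available. If this fails from the
# machine running the open-webui container, the WebUI cannot connect either.
curl http://your-computer-ip:11434/api/tags
```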

@LaptopDev commented on GitHub (Mar 17, 2024):

> docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=http://your-computer-ip:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

This doesn't seem to work for me.

```
server:~$ pgrep -a ollama
33395 ollama-linux-amd64 serve
server:~$ sudo netstat -tulnp | grep <PID>
bash: syntax error near unexpected token `newline'
server:~$ sudo netstat -tulnp | grep 33395
[sudo] password for user: 
tcp        0      0 127.0.0.1:11434         0.0.0.0:*               LISTEN      33395/ollama-linux- 
server2:~$ 
```

But as you can see, the process is listening on said port. Does the executable name have to match 'ollama'?

@justinh-rahb commented on GitHub (Mar 17, 2024):

Is your Ollama running on a server or on your desktop/laptop? I'm getting mixed messages here, which is confusing my ability to assist. Your real problem is obvious from the output above: your Ollama is only listening on the 127.0.0.1 interface; you need it to listen on all interfaces by setting `OLLAMA_HOST=0.0.0.0`.

Read their FAQ:
https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux
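
Since you are launching the binary by hand rather than via systemd, the simplest form of that fix is probably just to set the variable when starting the server; this is a sketch based on the binary name from your pgrep output above, so adjust the path to wherever the executable actually lives:

```bash
# Bind Ollama to all interfaces instead of 127.0.0.1 so the Docker container
# (or any remote client) can reach it on port 11434.
OLLAMA_HOST=0.0.0.0 ./ollama-linux-amd64 serve

# If Ollama were instead installed as a systemd service, the linked FAQ's
# approach is roughly: `systemctl edit ollama.service`, then add
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
# and restart the service.
```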

@LaptopDev commented on GitHub (Mar 17, 2024):

I see. Thank you for your patience! I got it working now.

Reference: github-starred/open-webui#490