[GH-ISSUE #5959] Ollama is running but can't access it from OpenWebUI #3725

Closed
opened 2026-04-12 14:32:11 -05:00 by GiteaMirror · 20 comments
Owner

Originally created by @ns-bcr on GitHub (Jul 25, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5959

What is the issue?

Hey,
So, I am having a problem: I have Ollama running on ubuntu-server 24.04 LTS.
It works properly locally, but from my computer I can't access it. On the server itself I can run `ollama run llama2` and it works.
I can also run `curl localhost:11434` on the server and it says "Ollama is running".
But from my computer, when I go to 192.168.1.145:11434 I get nothing, and if I try `curl 192.168.1.145:11434` I get `curl: (7) Failed to connect to 192.168.1.145 port 80 after 0 ms: Couldn't connect to server`

I tried multiple things:

  • I tried opening the port on TCP
  • I tried looking in the logs, but found nothing wrong
  • I also tried reinstalling it from zero.

I can provide more information if necessary.
Thanks.

OS

Linux

GPU

Intel

CPU

Intel

Ollama version

0.2.8

GiteaMirror added the bug label 2026-04-12 14:32:11 -05:00

@rick-github commented on GitHub (Jul 25, 2024):

Add `OLLAMA_HOST=0.0.0.0:11434` to the server environment and restart.
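For the default Linux install, where ollama runs as a systemd service, one way to apply this is a drop-in override (a sketch of the approach spelled out later in this thread) via `sudo systemctl edit ollama.service`:

```ini
# Drop-in override for ollama.service: bind on all interfaces, not just loopback
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

Then `sudo systemctl daemon-reload && sudo systemctl restart ollama` makes the new binding take effect.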


@ns-bcr commented on GitHub (Jul 25, 2024):

I tried, but now it's still weird... If I run `curl 192.168.1.145:11434` on the server it works, but if I try from my computer in the browser, I still get nothing. Same thing in OpenWebUI. I tried disabling the firewall but nothing changed.


@rick-github commented on GitHub (Jul 25, 2024):

On the server, what does `sudo netstat -pant | grep :11434` show?


@ns-bcr commented on GitHub (Jul 25, 2024):

it shows this:
![image](https://github.com/user-attachments/assets/c514f152-240a-4a54-a717-cb9fbb7ab437)

@rick-github commented on GitHub (Jul 25, 2024):

Ollama has the right port binding. I started a vserver, installed ubuntu-24.04-live-server-amd64 on it, installed ollama, made the adjustment to OLLAMA_HOST, and was able to get "Ollama is running" from a different machine. So there's either a firewall on your server, a firewall on your local computer, or some other network kit between the two that is blocking the connection.

What do you see if you run the following on your server: `sudo tcpdump -i any dst port 11434 or src host <ip-of-your-local-machine>` and then do `curl <ip-or-name-of-server>:11434` on your local machine?
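As a complement to curl from the client side: a connection that fails immediately ("refused") means nothing is listening or something actively rejected it, while a silent timeout usually means a firewall is dropping packets. A small generic TCP probe, run from the client machine, checks the same thing at the transport level (a standalone sketch, not part of Ollama or this thread's tooling):

```python
import socket

def probe(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unreachable-network errors.
        return False

# Example: probe("192.168.1.145", 11434) from the client mirrors what
# `curl 192.168.1.145:11434` tests before any HTTP is exchanged.
```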


@ns-bcr commented on GitHub (Jul 25, 2024):

It's getting even worse. When I type `sudo tcpdump -i any dst port 11434 or src host <ip-of-your-local-machine>` it starts listing ports in a loop, and now curl doesn't even work anymore. I don't get any answers at all, for `curl localhost:11434`, 192.168.1.145:11434, or even 127.0.0.1:11434: nothing.
I am going to try a fresh reinstall on another machine and see if anything changes.

@rick-github commented on GitHub (Jul 25, 2024):

Try it without `or src host xx`


@rick-github commented on GitHub (Jul 25, 2024):

Or better: `sudo tcpdump -i any \( dst port 11434 or src host <ip-of-your-local-machine> \) and not port 22`


@ns-bcr commented on GitHub (Jul 25, 2024):

Still the same, it starts listing ports in a loop


@rick-github commented on GitHub (Jul 25, 2024):

OK, `sudo tcpdump -i any dst port 11434 or src port 11434`.


@ns-bcr commented on GitHub (Jul 25, 2024):

Nothing changed


@rick-github commented on GitHub (Jul 25, 2024):

`sudo tcpdump -i any dst port 11434 or src port 11434` prints lots of output? Are you pulling a model?


@ns-bcr commented on GitHub (Jul 25, 2024):

I am a bit lost right now, actually. I'll try tomorrow.


@ns-bcr commented on GitHub (Jul 26, 2024):

OK, so I went and set up another machine with Ollama, and it still doesn't work. Weird.


@ns-bcr commented on GitHub (Jul 26, 2024):

And now for `sudo netstat -pant | grep :11434` I get: `tcp 0 0 127.0.0.1:11434 0.0.0.0:* LISTEN 2564/ollama`
Also, if I do `curl 192.168.1.146:11434` it works and says "Ollama is running", but only on the server, not from my computer.
I modified the ollama.service file, and now for `sudo netstat -pant | grep :11434` I get: `tcp6 0 0 :::11434 :::* LISTEN 3073/ollama`


@ns-bcr commented on GitHub (Jul 26, 2024):

OK, so I found out I can't access it from anywhere other than my server, so I set OpenWebUI up on the server and pointed it at the localhost IP and port, and now it works.
Thanks for your help!


@rice818 commented on GitHub (Oct 22, 2024):

> OK, so I found out I can't access it from anywhere other than my server, so I set OpenWebUI up on the server and pointed it at the localhost IP and port, and now it works. Thanks for your help!

May I know how to "make OpenWebUI be on the server with the localhost IP and port"?


@rick-github commented on GitHub (Oct 22, 2024):

He installed open-webui on the same machine on which ollama was running, and configured open-webui to connect to http://127.0.0.1:11434.
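For the common Docker setup: 127.0.0.1 inside a container is not the host, so a localhost URL only reaches ollama when the container shares the host's network (or targets `host.docker.internal` instead). A compose sketch, assuming open-webui's documented `OLLAMA_BASE_URL` setting and the `ghcr.io/open-webui/open-webui:main` image:

```yaml
# docker-compose.yml sketch (image tag and setting name are assumptions)
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    network_mode: host                          # container shares the host's loopback
    environment:
      - OLLAMA_BASE_URL=http://127.0.0.1:11434  # ollama listening on the same machine
```

With the default bridge network instead of `network_mode: host`, the URL would need to point at the host (e.g. via `host.docker.internal`) rather than 127.0.0.1.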


@ClaireRovic commented on GitHub (Apr 23, 2025):

This worked for me, from https://github.com/ollama/ollama/blob/main/docs/faq.md. Using `sudo ss -tulnp | grep 11434` it showed ollama was only listening on 127.0.0.1.

Setting environment variables on Linux

If Ollama is run as a systemd service, environment variables should be set using `systemctl`:

1. Edit the systemd service by calling `systemctl edit ollama.service`. This will open an editor.

2. For each environment variable, add an `Environment` line under the `[Service]` section (this section needs to be created near the top):

       [Service]
       Environment="OLLAMA_HOST=0.0.0.0:11434"

3. Save and exit.

4. Reload systemd and restart Ollama:

       systemctl daemon-reload
       systemctl restart ollama


@glovacai commented on GitHub (Aug 8, 2025):

> This worked for me, from https://github.com/ollama/ollama/blob/main/docs/faq.md. Using `sudo ss -tulnp | grep 11434` it showed ollama was only listening on 127.0.0.1.
>
> Setting environment variables on Linux: if Ollama is run as a systemd service, environment variables should be set using `systemctl`:
>
> 1. Edit the systemd service by calling `systemctl edit ollama.service`. This will open an editor.
> 2. For each environment variable, add an `Environment` line under the `[Service]` section (this section needs to be created near the top):
>
>        [Service]
>        Environment="OLLAMA_HOST=0.0.0.0:11434"
>
> 3. Save and exit.
> 4. Reload systemd and restart Ollama:
>
>        systemctl daemon-reload
>        systemctl restart ollama

This is what fixed it for me.

Reference: github-starred/ollama#3725