[GH-ISSUE #8960] Remote Access #5813

Closed
opened 2026-04-12 17:09:13 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @LOUYWEI on GitHub (Feb 9, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8960

What is the issue?

I modified the file /etc/systemd/system/ollama.service and added:

[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"

and then ran:

systemctl daemon-reload
systemctl restart ollama

However, the remotely deployed model cannot be reached from my local machine. What should I do?

Part of the relevant rules:

1. sudo iptables -L -v -n

Chain INPUT (policy ACCEPT 18685 packets, 2779K bytes)
pkts bytes target prot opt in out source destination
205K 11M ACCEPT tcp -- * * 0.0.0.0/0 0.0.0.0/0 tcp dpt:11434

2. netstat -tuln | grep 11434

tcp6 0 0 :::11434 :::* LISTEN

Relevant log output


OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.5.7
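As a sketch of the setup described above (assuming a systemd-based install; the drop-in path below is the standard systemd override location, not taken from this thread), the environment change can also be made with a drop-in rather than editing the unit file directly, and the effective listener verified afterwards:

```shell
# Sketch: expose Ollama on all interfaces via a systemd drop-in,
# then confirm the service actually picked up the variable.
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf >/dev/null <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
EOF
sudo systemctl daemon-reload
sudo systemctl restart ollama
systemctl show ollama --property=Environment   # should include OLLAMA_HOST=0.0.0.0
ss -tlnp | grep 11434                          # expect 0.0.0.0:11434 or :::11434
```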

GiteaMirror added the bug label 2026-04-12 17:09:13 -05:00

@mxyng commented on GitHub (Feb 10, 2025):

Can you confirm Ollama is accessible on the host system on the addresses you expect?

e.g.

nc -vz 127.0.0.1 11434

or

nc -vz <your-host-ip> 11434

If it is, that suggests a networking issue rather than an Ollama issue.
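These probes can be wrapped in a small helper (a sketch; the function name is illustrative, and a healthy Ollama server answers "Ollama is running" on its root path):

```shell
# check_ollama HOST [PORT]: print the server's root-path response,
# or nothing if host:port is unreachable within the timeout.
check_ollama() {
  curl -s --max-time 3 "http://${1}:${2:-11434}/"
}

check_ollama 127.0.0.1      # on the server itself
check_ollama 192.0.2.10     # replace with your server's LAN address
```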


@rick-github commented on GitHub (Feb 10, 2025):

What's the result of

cat /proc/sys/net/ipv6/bindv6only
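For context, `bindv6only` controls whether an IPv6 wildcard socket also accepts IPv4 connections; with the Linux default of 0, a `tcp6 :::11434` listener serves both protocols:

```shell
# 0 (Linux default) = dual-stack: a socket bound to :: also accepts
# IPv4 clients as IPv4-mapped addresses (::ffff:a.b.c.d).
# 1 = the socket is IPv6-only.
cat /proc/sys/net/ipv6/bindv6only
```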

@LOUYWEI commented on GitHub (Feb 12, 2025):

> Can you confirm Ollama is accessible on the host system on the addresses you expect?
>
> e.g.
>
> nc -vz 127.0.0.1 11434
>
> or
>
> nc -vz <your-host-ip> 11434
>
> If it is, it would suggest it's a networking, not Ollama, issue

I tried the command:

nc -vz 127.0.0.1 11434

and the result was: Connection to 127.0.0.1 11434 port [tcp/*] succeeded!


@LOUYWEI commented on GitHub (Feb 12, 2025):

> What's the result of
>
> cat /proc/sys/net/ipv6/bindv6only

The result is 0, not 1.


@mxyng commented on GitHub (Feb 12, 2025):

It seems Ollama is accessible on localhost. Since netstat reports it's listening on the wildcard address, that suggests a networking issue unrelated to Ollama.


@LOUYWEI commented on GitHub (Feb 13, 2025):

> It seems Ollama is accessible on the localhost. Since netstat reports it's accessible on 0.0.0.0, it would suggest it's networking issue and unrelated to Ollama

From the netstat result, I see that it is only listening on an IPv6 address. Is this why I cannot access Ollama remotely? Do you know how to make it listen on IPv4 at the same time? I tried to change it, but it didn't work. Thank you.


@mxyng commented on GitHub (Feb 13, 2025):

> nc -vz 127.0.0.1 11434
> and the result is : Connection to 127.0.0.1 11434 port [tcp/*] succeeded!

You confirmed it's serving on 127.0.0.1 so that shouldn't be the issue. Check your firewalls, port forwarding, etc.
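A sketch of the checks this suggests, assuming a Linux host; which firewall frontend applies (ufw, firewalld, or raw iptables) depends on the distribution:

```shell
# On the Ollama host: confirm the listener and watch firewall counters.
ss -tlnp | grep 11434                      # confirm the listener and owning process
sudo iptables -L INPUT -v -n | grep 11434  # packet counters should grow on remote attempts
sudo ufw status verbose                    # if ufw manages the firewall
sudo firewall-cmd --list-ports             # if firewalld manages the firewall
# From the client machine (replace with the server's address):
curl -s --max-time 3 http://<server-ip>:11434/   # a healthy server prints: Ollama is running
```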


@pminimd commented on GitHub (Feb 25, 2025):

> (The comment quotes the original issue report above in full.)

Version 0.5.11 has the same issue.

Reference: github-starred/ollama#5813