[GH-ISSUE #4263] Unable to bind the private EC2 instance IP in the Ollama service file to restrict access #2663

Closed
opened 2026-04-12 13:00:06 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @devivaraprasad901 on GitHub (May 8, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4263

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

Unable to bind the private EC2 instance IP in the Ollama service file to restrict access; the instance is in a private VPC.
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="OLLAMA_HOST=<private ec2 instance ip(x.x.x.x>"

[Install]
WantedBy=default.target
"/etc/systemd/system/ollama.service" [readonly] 14L, 238B
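The placeholder on the `Environment=` line above is the likely culprit: OLLAMA_HOST must be a literal address, optionally with a port. A minimal sketch of a corrected fragment, using a hypothetical private IP of 10.0.1.25 (not an address from this issue):

```ini
# /etc/systemd/system/ollama.service (fragment)
# 10.0.1.25 is a made-up example; substitute the instance's own private IP,
# which must appear in `ip addr` / `hostname -I` output on that host.
[Service]
Environment="OLLAMA_HOST=10.0.1.25:11434"
```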

When I start the service, it does not come up:

sudo systemctl status ollama
● ollama.service - Ollama Service
Loaded: loaded (/etc/systemd/system/ollama.service; enabled; preset: enabled)
Active: activating (auto-restart) (Result: exit-code) since Wed 2024-05-08 18:31:20 UTC; 2s ago
Process: 311875 ExecStart=/usr/local/bin/ollama serve (code=exited, status=1/FAILURE)
Main PID: 311875 (code=exited, status=1/FAILURE)
CPU: 13ms

SYSLOG:
2024-05-08T18:06:46.719183+00:00 ip-xx-xx-x-xx systemd[1]: Started ollama.service - Ollama Service.
2024-05-08T18:06:46.729714+00:00 ip-xx-xx-xx-xx ollama[307072]: Error: listen tcp x.x.x.x:11434: bind: cannot assign requested address
2024-05-08T18:06:46.731993+00:00 ip-xx-xxx-x-xx systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
2024-05-08T18:06:46.732178+00:00 ip-x-x-x-x systemd[1]: ollama.service: Failed with result 'exit-code'.
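`bind: cannot assign requested address` means the kernel has no local interface carrying the address given in OLLAMA_HOST (a common cause on EC2 is using the instance's public IP, which is NAT-ed and never assigned to the instance itself). A small pre-flight sketch, assuming a Linux host where `hostname -I` is available; the IP below is a made-up documentation-range example:

```shell
# Check whether an address is actually assigned to a local interface,
# i.e. whether Ollama could bind it.
check_local() {
    if hostname -I 2>/dev/null | tr ' ' '\n' | grep -qx "$1"; then
        echo "ok: $1 is local; Ollama can bind it"
    else
        echo "error: $1 is not on any local interface; binding it fails"
    fi
}

check_local 203.0.113.7   # TEST-NET-3 address: never local, so this reports an error
```

Running the check against the address you intend to put in OLLAMA_HOST before restarting the service avoids the restart loop shown in the status output above.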

I have three EC2 instances:
One instance has Ollama installed, with the LLAMA3 model running.
A second instance hosts the LLM inference code, which stores the access keys.
The application code is hosted on a third EC2 instance, which accesses the LLAMA3 instance using the LLM inference keys.

My goal is to set the LLM inference instance's private IP in the Ollama service file on the LLAMA3 EC2 instance, to restrict access.

OS

Linux

GPU

Nvidia

CPU

AMD

Ollama version

ollama version is 0.1.32

GiteaMirror added the question label 2026-04-12 13:00:06 -05:00
Author
Owner

@dhiltgen commented on GitHub (Jul 25, 2024):

I think this is a configuration error. I'm able to successfully create 2 EC2 VMs on a private network and use Ollama between the hosts.

My /etc/systemd/system/ollama.service looks like this:

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"
Environment="OLLAMA_HOST=XX.XX.XX.XX:11434"

[Install]
WantedBy=default.target

After doing the daemon reload and restarting the service, I can confirm it is listening on the expected private IP:

% journalctl -u ollama | grep "Listening on" | tail
Jul 25 17:22:29 ip-XX-XX-XX-XX ollama[1547]: time=2024-07-25T17:22:29.564Z level=INFO source=routes.go:1147 msg="Listening on XX.XX.XX.XX:11434 (version 0.2.8)"

And both VMs are in a security group where I've opened inbound traffic for port 11434 from the local subnet.
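The steps described in this comment can be summarized as a command sequence; the IP is a placeholder for the Ollama host's private address, and `/api/version` is Ollama's version endpoint:

```shell
# On the Ollama host, after editing the unit file:
sudo systemctl daemon-reload
sudo systemctl restart ollama

# From the client instance in the same subnet (the security group must
# allow inbound TCP 11434), confirm the server is reachable:
curl http://10.0.1.25:11434/api/version
```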

Reference: github-starred/ollama#2663