[GH-ISSUE #7128] Ollama host is still 127.0.0.1 even though OLLAMA_HOST=0.0.0.0:11434 is set in the environment #4528

Closed
opened 2026-04-12 15:27:51 -05:00 by GiteaMirror · 9 comments
Owner

Originally created by @Yangshford on GitHub (Oct 8, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7128

What is the issue?

Apologies for my English; I am not a native speaker.

I run Ollama on WSL2. If I start it with `ollama serve`, I cannot reach it from a browser on Windows at "localhost:11434" (it is reachable from inside WSL2). I have set the environment in /etc/systemd/system/ollama.service, but it doesn't take effect:

![1728368015006](https://github.com/user-attachments/assets/af8363ef-829b-4a43-a684-f5530435605c)

However, it does work if I run `export OLLAMA_HOST=0.0.0.0` before starting `ollama serve`.

What can I do to make it listen on 0.0.0.0 by default?

OS

WSL2

GPU

Nvidia

CPU

Intel

Ollama version

0.3.12

GiteaMirror added the bug label 2026-04-12 15:27:51 -05:00

@rick-github commented on GitHub (Oct 8, 2024):

Use `systemctl start ollama`. If you are running `ollama serve` from the command line, it will not use the variables you have assigned in `/etc/systemd/system/ollama.service`. If you run `systemctl start ollama` and ollama doesn't start or doesn't respond, run `journalctl -u ollama --no-pager` and post the result here.
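For readers landing here with the same question: to make the systemd-managed service listen on all interfaces by default, the usual systemd pattern is a drop-in override rather than launching `ollama serve` by hand. A minimal sketch (the drop-in path is the conventional systemd location, not something stated in this thread):

```
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

After creating it (for example via `sudo systemctl edit ollama`), run `sudo systemctl daemon-reload` and `sudo systemctl restart ollama` so the new environment takes effect.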


@Yangshford commented on GitHub (Oct 8, 2024):

> Use `systemctl start ollama`. If you are running `ollama serve` from the command line, it will not use the variables you have assigned in `/etc/systemd/system/ollama.service`. If you run `systemctl start ollama` and ollama doesn't start or doesn't respond, run `journalctl -u ollama --no-pager` and post the result here.

thank you!


@lzf00 commented on GitHub (Oct 24, 2024):

> > Use `systemctl start ollama`. If you are running `ollama serve` from the command line, it will not use the variables you have assigned in `/etc/systemd/system/ollama.service`. If you run `systemctl start ollama` and ollama doesn't start or doesn't respond, run `journalctl -u ollama --no-pager` and post the result here.
> >
> > thank you!

How exactly is this command supposed to be run? I hit the same problem with ollama (0.3.6) deployed on Linux; the configuration does not take effect.


@rick-github commented on GitHub (Oct 24, 2024):

At the command line, type in:

```
systemctl start ollama
```

If you receive errors, post them here. Also run

```
systemctl cat ollama
```

and add the output.


@ghost-909 commented on GitHub (Nov 1, 2024):

> At the command line, type in:
>
> ```
> systemctl start ollama
> ```
>
> If you receive errors, post them here. Also run
>
> ```
> systemctl cat ollama
> ```
>
> and add the output.

I ran into the same problem, but my ollama is running on Ubuntu, started from the command line with `./ollama serve`.
I have also set OLLAMA_HOST=0.0.0.0:11434 in the environment section.
In this setup ollama runs fine and can be reached at "localhost:11434" or "127.0.0.1:11434", but it cannot be reached at IPaddress:11434.

If I run `systemctl start ollama`, nothing happens (it doesn't seem to work).

The following is the `systemctl start ollama` output
(it is very long, so I only show the latest lines):

```
11月 01 16:21:25 ghost-Precision-3660 systemd[12734]: ollama.service: Failed to locate executable /usr/local/bin/ollama: Is a directory
11月 01 16:21:25 ghost-Precision-3660 systemd[12734]: ollama.service: Failed at step EXEC spawning /usr/local/bin/ollama: Is a directory
11月 01 16:21:25 ghost-Precision-3660 systemd[1]: ollama.service: Main process exited, code=exited, status=203/EXEC
11月 01 16:21:25 ghost-Precision-3660 systemd[1]: ollama.service: Failed with result 'exit-code'.
11月 01 16:21:28 ghost-Precision-3660 systemd[1]: ollama.service: Scheduled restart job, restart counter is at 771.
11月 01 16:21:28 ghost-Precision-3660 systemd[1]: Stopped Ollama Service.
11月 01 16:21:29 ghost-Precision-3660 systemd[1]: Started Ollama Service.
11月 01 16:21:29 ghost-Precision-3660 systemd[12735]: ollama.service: Failed to locate executable /usr/local/bin/ollama: Is a directory
11月 01 16:21:29 ghost-Precision-3660 systemd[12735]: ollama.service: Failed at step EXEC spawning /usr/local/bin/ollama: Is a directory
11月 01 16:21:29 ghost-Precision-3660 systemd[1]: ollama.service: Main process exited, code=exited, status=203/EXEC
11月 01 16:21:29 ghost-Precision-3660 systemd[1]: ollama.service: Failed with result 'exit-code'.
11月 01 16:21:32 ghost-Precision-3660 systemd[1]: ollama.service: Scheduled restart job, restart counter is at 772.
11月 01 16:21:32 ghost-Precision-3660 systemd[1]: Stopped Ollama Service.
11月 01 16:21:32 ghost-Precision-3660 systemd[1]: Started Ollama Service.
11月 01 16:21:32 ghost-Precision-3660 systemd[12752]: ollama.service: Failed to locate executable /usr/local/bin/ollama: Is a directory
11月 01 16:21:32 ghost-Precision-3660 systemd[12752]: ollama.service: Failed at step EXEC spawning /usr/local/bin/ollama: Is a directory
11月 01 16:21:32 ghost-Precision-3660 systemd[1]: ollama.service: Main process exited, code=exited, status=203/EXEC
11月 01 16:21:32 ghost-Precision-3660 systemd[1]: ollama.service: Failed with result 'exit-code'.
```

And the following is the `systemctl cat ollama` output:

```
ghost@ghost-Precision-3660:~$ systemctl cat ollama
# /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="OLLAMA_HOST=0.0.0.0"
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/snap/bin"
Environment="OLLAMA_MODELS=/home/ghost/ollama_model"

[Install]
WantedBy=default.target
```

BTW: because the ollama install package is large, I downloaded the install file and unpacked it manually instead of running the online install script. I'm not sure this method is completely correct.
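The `status=203/EXEC` "Is a directory" lines above mean systemd found a directory at the ExecStart path instead of a binary, which fits the manual unpack described here. A small, hypothetical shell check for this failure mode (the helper name is made up for illustration):

```shell
# A path used in ExecStart= must be a regular executable file,
# not a directory left behind by unpacking an archive in the wrong place.
check_execstart() {
    path="$1"
    if [ -d "$path" ]; then
        echo "error: $path is a directory, not a binary"
        return 1
    elif [ ! -x "$path" ]; then
        echo "error: $path is missing or not executable"
        return 1
    fi
    echo "ok: $path looks runnable"
}

check_execstart /usr/local/bin/ollama || true  # on the affected machine this reports a directory
check_execstart /bin/sh                        # a real binary passes the check
```

If the manual unpack put a directory at /usr/local/bin/ollama, re-extracting so that the actual binary sits at that path should let the service start.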


@rick-github commented on GitHub (Nov 1, 2024):

If you are running the server with `./ollama serve`, it's not using the environment variables from the service configuration file. Do the following instead:

```
$ export OLLAMA_HOST=0.0.0.0:11434
$ ./ollama serve
```

Since you are not using the service, you can also stop systemd from trying to start it, which is why you are getting errors from `systemctl start ollama`:

```
$ sudo systemctl stop ollama
```
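To illustrate why the `export` matters (a standalone demo; ollama itself is not needed): a plain assignment stays local to the shell, while an exported variable is inherited by child processes such as the server:

```shell
OLLAMA_HOST=0.0.0.0:11434                  # shell-local assignment
sh -c 'echo "child sees: [$OLLAMA_HOST]"'  # the child does not see it
export OLLAMA_HOST                         # put it in the environment
sh -c 'echo "child sees: [$OLLAMA_HOST]"'  # the child now sees the value
```

This is why setting the variable only in the service file has no effect on a hand-launched `./ollama serve`, and vice versa.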
Reference: github-starred/ollama#4528