[GH-ISSUE #6248] Error: could not connect to ollama app, is it running? #29668

Closed
opened 2026-04-22 08:45:04 -05:00 by GiteaMirror · 20 comments

Originally created by @zzxgraph on GitHub (Aug 8, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6248

What is the issue?

(Llama3.1) [zzx@master ~]$ sudo journalctl -e -u ollama
[sudo] password for zzx:
Sorry, try again.
[sudo] password for zzx:
Aug 08 09:02:29 master systemd[1]: ollama.service failed.
Aug 08 09:02:33 master systemd[1]: ollama.service holdoff time over, scheduling restart.
Aug 08 09:02:33 master systemd[1]: Stopped Ollama Service.
Aug 08 09:02:33 master systemd[1]: Started Ollama Service.
Aug 08 09:02:33 master systemd[1]: ollama.service: main process exited, code=exited, status=127/n/a
Aug 08 09:02:33 master systemd[1]: Unit ollama.service entered failed state.
Aug 08 09:02:33 master systemd[1]: ollama.service failed.
Aug 08 09:02:36 master systemd[1]: ollama.service holdoff time over, scheduling restart.
Aug 08 09:02:36 master systemd[1]: Stopped Ollama Service.
Aug 08 09:02:36 master systemd[1]: Started Ollama Service.
Aug 08 09:02:36 master systemd[1]: ollama.service: main process exited, code=exited, status=127/n/a
Aug 08 09:02:36 master systemd[1]: Unit ollama.service entered failed state.
Aug 08 09:02:36 master systemd[1]: ollama.service failed.
Aug 08 09:02:39 master systemd[1]: ollama.service holdoff time over, scheduling restart.
Aug 08 09:02:39 master systemd[1]: Stopped Ollama Service.
Aug 08 09:02:39 master systemd[1]: Started Ollama Service.

OS

Linux

GPU

Nvidia

CPU

No response

Ollama version

(Llama3.1) [zzx@master ~]$ ollama --version
Warning: could not connect to a running Ollama instance
Warning: client version is 0.3.3

GiteaMirror added the bug label 2026-04-22 08:45:04 -05:00

@rick-github commented on GitHub (Aug 8, 2024):

How did you install ollama? What's the contents of `/etc/systemd/system/ollama.service`? What's the output of `which ollama`?

@zzxgraph commented on GitHub (Aug 9, 2024):

1. Installed it with the command:

curl -fsSL https://ollama.com/install.sh | sh

2. sudo vim /etc/systemd/system/ollama.service

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/home/zzx/.conda/envs/Llama3.1/bin:/usr/local/anaconda3/condabin:/usr/local/anaconda3/bin:/opt/rh/devtoolset-8/root/usr/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/local/cuda-11.8/bin:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.382.b05-1.el7_9.aarch64/jre/bin:/data/hadoop/app/bin:/data/hadoop/app/sbin:/usr/local/hive/bin:/home/zzx/.local/bin:/home/zzx/bin"

[Install]
WantedBy=default.target

3. (Llama3.1) [zzx@master ~]$ which ollama
/usr/local/bin/ollama

@rick-github commented on GitHub (Aug 9, 2024):

This looks normal. What happens if you run `sudo -u ollama -g ollama /usr/local/bin/ollama serve`?

@zzxgraph commented on GitHub (Aug 9, 2024):

> This looks normal. What happens if you run `sudo -u ollama -g ollama /usr/local/bin/ollama serve`?

(Llama3.1) [zzx@master ~]$ sudo -u ollama -g ollama /usr/local/bin/ollama serve
/var/tmp/sclw2gJXn: line 8: -u: command not found
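
Note: a real sudo would never report `-u: command not found`, and the `/var/tmp/scl...` path in the error hints that the command is being routed through a Red Hat Software Collections (scl) wrapper script (note the devtoolset-8 entry in the PATH above) rather than reaching /usr/bin/sudo directly. A hedged way to check what `sudo` actually resolves to in this shell (standard bash builtins, nothing ollama-specific):

type -a sudo             # aliases, functions, and every sudo on the PATH, in lookup order
which -a sudo            # every sudo executable found on the PATH
alias sudo 2>/dev/null   # prints an alias definition if one is shadowing sudo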

@rick-github commented on GitHub (Aug 9, 2024):

What's the output of these commands:
uname -a
sudo su - ollama -s /bin/bash -c '/usr/local/bin/ollama serve'

@zzxgraph commented on GitHub (Aug 9, 2024):

(Llama3.1) [zzx@master ~]$ uname -a
Linux master 4.18.0-348.20.1.el7.aarch64 #1 SMP Wed Apr 13 20:57:50 UTC 2022 aarch64 aarch64 aarch64 GNU/Linux
(Llama3.1) [zzx@master ~]$ sudo su - ollama -s /bin/bash -c '/usr/local/bin/ollama serve'
/etc/locale.conf: line 2: warning: setlocale: LC_ALL: cannot change locale (zh_CN.UTF-8)
2024/08/09 14:41:45 routes.go:1108: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR: OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-08-09T14:41:45.844+08:00 level=INFO source=images.go:781 msg="total blobs: 0"
time=2024-08-09T14:41:45.844+08:00 level=INFO source=images.go:788 msg="total unused blobs removed: 0"
time=2024-08-09T14:41:45.844+08:00 level=INFO source=routes.go:1155 msg="Listening on 127.0.0.1:11434 (version 0.3.3)"
time=2024-08-09T14:41:45.845+08:00 level=INFO source=payload.go:30 msg="extracting embedded files" dir=/tmp/ollama1798133808/runners
time=2024-08-09T14:41:50.696+08:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu cuda_v11]"
time=2024-08-09T14:41:50.696+08:00 level=INFO source=gpu.go:205 msg="looking for compatible GPUs"
time=2024-08-09T14:41:54.260+08:00 level=INFO source=types.go:105 msg="inference compute" id=GPU-0b7a2869-735d-f3bf-289e-dc7e31bbf8f8 library=cuda compute=8.0 driver=12.4 name="NVIDIA A100-PCIE-40GB" total="39.4 GiB" available="39.0 GiB"
time=2024-08-09T14:41:54.260+08:00 level=INFO source=types.go:105 msg="inference compute" id=GPU-b373e06d-c29c-cec3-2677-958edb72f143 library=cuda compute=8.0 driver=12.4 name="NVIDIA A100-PCIE-40GB" total="39.4 GiB" available="39.0 GiB"

@rick-github commented on GitHub (Aug 9, 2024):

sudo systemctl stop ollama
sudo systemctl daemon-reload
sudo systemctl start ollama
sudo systemctl status ollama

@zzxgraph commented on GitHub (Aug 9, 2024):

(Llama3.1) [zzx@master ~]$ sudo systemctl stop ollama
[sudo] password for zzx:
(Llama3.1) [zzx@master ~]$ sudo systemctl daemon-reload
(Llama3.1) [zzx@master ~]$ sudo systemctl start ollama
(Llama3.1) [zzx@master ~]$ sudo systemctl status ollama
● ollama.service - Ollama Service
Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Fri 2024-08-09 15:02:38 CST; 2s ago
Process: 14112 ExecStart=/usr/local/bin/ollama serve (code=exited, status=127)
Main PID: 14112 (code=exited, status=127)

Aug 09 15:02:38 master systemd[1]: Unit ollama.service entered failed state.
Aug 09 15:02:38 master systemd[1]: ollama.service failed.

Hmm... it seems that it still doesn't work.

@rick-github commented on GitHub (Aug 9, 2024):

status=127 means that the program wasn't found, but `sudo su - ollama -s /bin/bash -c '/usr/local/bin/ollama serve'` started it. I thought it possible that systemd had an old version of the service file, so I tried `systemctl daemon-reload` to refresh it, but that didn't help. So all we know is that systemd can't find the file. What does `ls -ld /usr /usr/local /usr/local/bin` show?
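
For reference, 127 is the standard shell convention for "command not found", easy to reproduce from any bash prompt (a minimal illustration, not specific to ollama):

bash -c 'definitely-not-a-real-command'; echo "exit status: $?"
# prints a "command not found" error, then: exit status: 127

The dynamic loader also exits with 127 when a binary's shared-library dependencies can't be resolved, so on an older distro it may be worth checking `ldd /usr/local/bin/ollama` for any "not found" entries.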

@zzxgraph commented on GitHub (Aug 9, 2024):

(Llama3.1) [zzx@master ~]$ ls -ld /usr /usr/local /usr/local/bin
drwxr-xr-x. 14 root root 166 Jul 11 2023 /usr
drwxr-xr-x. 23 root root 4096 Jul 31 15:49 /usr/local
drwxr-xr-x. 2 root root 172 Aug 7 16:57 /usr/local/bin

I encountered this problem when I first installed it and tried uninstalling and reinstalling, but the problem still exists.

@rick-github commented on GitHub (Aug 9, 2024):

What's the output of `ls -l /usr/local/bin/ollama`?

In /etc/systemd/system/ollama.service, try replacing

Environment="PATH=/home/zzx/.conda/envs/Llama3.1/bin:/usr/local/anaconda3/condabin:/usr/local/anaconda3/bin:/opt/rh/devtoolset-8/root/usr/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/usr/local/cuda-11.8/bin:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.382.b05-1.el7_9.aarch64/jre/bin:/data/hadoop/app/bin:/data/hadoop/app/sbin:/usr/local/hive/bin:/home/zzx/.local/bin:/home/zzx/bin"

with

Environment="PATH=/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin"

Then

sudo systemctl stop ollama
sudo systemctl daemon-reload
sudo systemctl start ollama
sudo systemctl status ollama
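
If it still fails after the edit, it can help to confirm what systemd actually loaded for the unit (standard systemctl/journalctl invocations):

systemctl cat ollama                               # the unit file systemd currently has loaded
systemctl show -p ExecStart -p Environment ollama  # the resolved exec command and environment
journalctl -u ollama -n 20 --no-pager              # the most recent log lines for the unit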

@zzxgraph commented on GitHub (Aug 9, 2024):

(Llama3.1) [zzx@master ~]$ ls -l /usr/local/bin/ollama
-rwxr-xr-x 1 root root 461357856 Aug 7 16:57 /usr/local/bin/ollama
(Llama3.1) [zzx@master ~]$ sudo vim /etc/systemd/system/ollama.service
[sudo] password for zzx:
(Llama3.1) [zzx@master ~]$ sudo systemctl stop ollama
Warning: ollama.service changed on disk. Run 'systemctl daemon-reload' to reload units.
(Llama3.1) [zzx@master ~]$ sudo systemctl daemon-reload
(Llama3.1) [zzx@master ~]$ sudo systemctl start ollama
(Llama3.1) [zzx@master ~]$ sudo systemctl status ollama
● ollama.service - Ollama Service
Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Fri 2024-08-09 15:41:52 CST; 2s ago
Process: 20416 ExecStart=/usr/local/bin/ollama serve (code=exited, status=127)
Main PID: 20416 (code=exited, status=127)

Aug 09 15:41:52 master systemd[1]: Unit ollama.service entered failed state.
Aug 09 15:41:52 master systemd[1]: ollama.service failed.

@rick-github commented on GitHub (Aug 9, 2024):

Well, now I'm grasping at straws. What's the result of these commands:
cat /etc/os-release
systemd --version

@zzxgraph commented on GitHub (Aug 9, 2024):

(Llama3.1) [zzx@master ~]$ cat /etc/os-release
NAME="CentOS Linux"
VERSION="7 (AltArch)"
ID="centos"
ID_LIKE="rhel fedora"
VERSION_ID="7"
PRETTY_NAME="CentOS Linux 7 (AltArch)"
ANSI_COLOR="0;31"
CPE_NAME="cpe:/o:centos:centos:7:server"
HOME_URL="https://www.centos.org/"
BUG_REPORT_URL="https://bugs.centos.org/"

CENTOS_MANTISBT_PROJECT="CentOS-7"
CENTOS_MANTISBT_PROJECT_VERSION="7"
REDHAT_SUPPORT_PRODUCT="centos"
REDHAT_SUPPORT_PRODUCT_VERSION="7"

(Llama3.1) [zzx@master ~]$ systemd --version
bash: systemd: command not found...

@rick-github commented on GitHub (Aug 9, 2024):

Hmm, you are running an ancient version of CentOS.

What does `/usr/lib/systemd/systemd --version` show?

@zzxgraph commented on GitHub (Aug 12, 2024):

(Llama3.1) [zzx@master ~]$ /usr/lib/systemd/systemd --version
systemd 219
+PAM +AUDIT +SELINUX +IMA -APPARMOR +SMACK +SYSVINIT +UTMP +LIBCRYPTSETUP +GCRYPT +GNUTLS +ACL +XZ +LZ4 -SECCOMP +BLKID +ELFUTILS +KMOD +IDN

@zzxgraph commented on GitHub (Aug 13, 2024):

> Hmm, you are running an ancient version of CentOS.
>
> What does `/usr/lib/systemd/systemd --version` show?

(Llama3.1) [zzx@master ~]$ /usr/lib/systemd/systemd --version
systemd 219
+PAM +AUDIT +SELINUX +IMA -APPARMOR +SMACK +SYSVINIT +UTMP +LIBCRYPTSETUP +GCRYPT +GNUTLS +ACL +XZ +LZ4 -SECCOMP +BLKID +ELFUTILS +KMOD +IDN

What should I do next?

@rick-github commented on GitHub (Aug 13, 2024):

I created a VM, installed CentOS 7 + ollama, and it worked fine, so I don't have an explanation for the problem you are experiencing. The only difference is the architecture; I will see if I can get an arm64 emulator working, but I don't expect any difference. At this point I would normally fall back to tracing the system calls in systemd to see why it can't find the program, but since I can't replicate the issue, I have no solution at the moment.

@rick-github commented on GitHub (Aug 13, 2024):

Try this: open two terminals and `sudo -s` in each to get a root shell. In one terminal, stop ollama: `systemctl stop ollama`. In the other terminal, run `strace -p 1 -f -o /tmp/ollama.strace`. In the first terminal, start ollama: `systemctl start ollama`. You should see a bunch of `Process attached` messages in the second terminal. Wait ten seconds or so, then ^C the `strace` command. Now you can check the contents of /tmp/ollama.strace for reasons why ollama failed to start. Try this as a starting point: `grep -5 execve /tmp/ollama.strace`.
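
Condensed, the procedure looks like this (a sketch of the steps above; same paths and timing):

# terminal 1: get a root shell and stop the service
sudo -s
systemctl stop ollama

# terminal 2: get a root shell, attach strace to PID 1, and follow forked children
sudo -s
strace -p 1 -f -o /tmp/ollama.strace

# terminal 1: trigger the failing start, wait ~10 seconds, then Ctrl-C the strace
systemctl start ollama

# either terminal: look for the failing exec and the calls around it
grep -5 execve /tmp/ollama.strace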

@pdevine commented on GitHub (Aug 30, 2024):

I think this is the smoking gun:

● ollama.service - Ollama Service
Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: disabled)
Active: activating (auto-restart) (Result: exit-code) since Fri 2024-08-09 15:41:52 CST; 2s ago
Process: 20416 ExecStart=/usr/local/bin/ollama serve (code=exited, status=127)
Main PID: 20416 (code=exited, status=127)

This is being tracked in #6541
