[GH-ISSUE #11565] ollama 0.9.6 can't run as a service, but ollama serve works well #54146

Open
opened 2026-04-29 05:16:44 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @sohochaser on GitHub (Jul 29, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11565

What is the issue?

ollama serve works well but exits after a while. It cannot be started as a service using: systemctl enable ollama....

Relevant log output

Here is the environment:

# /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
#use_gpu = true
#gpu_devices = 0,1,2
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"
Environment="OLLAMA_MODELS=/mnt/extdisk/ollama/.ollama/models"

[Install]
WantedBy=multi-user.target
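One detail worth flagging in the unit above: systemd's Environment= directive performs no shell expansion, so PATH=$PATH sets the literal five-character string $PATH as the service's PATH rather than the caller's search path. A minimal sketch of the behaviour (plain-shell analogy; the drop-in value in the comments is an assumption, not taken from this report):

```shell
# systemd passes Environment= values through verbatim: no $-expansion.
# Single quotes in shell behave the same way:
path_value='$PATH'          # what the service would see for PATH
echo "$path_value"          # prints the literal text: $PATH

# A common fix is an explicit path in a drop-in (sudo systemctl edit ollama):
#   [Service]
#   Environment="PATH=/usr/local/bin:/usr/bin:/bin"
```

This is unlikely to break ollama itself (ExecStart uses an absolute path), but it is a latent misconfiguration worth cleaning up while debugging.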


(base) [root@pm-78bf ~]# ls -ltr /var/lib/|grep ollama
drwxr-xr-x   2 ollama         ollama            6 Jul 28 22:35 ollama




ollama serve works well but exits after a while (maybe an hour later)....

Jul 28 20:52:04 omnibook-ultra-14 systemd[1]: Started Ollama Service.
░░ Subject: A start job for unit ollama.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://lists.freedesktop.org/mailman/listinfo/systemd-devel
░░
░░ A start job for unit ollama.service has finished successfully.
░░
░░ The job identifier is 26095.
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: Couldn't find '/var/lib/ollama/.ollama/id_ed25519'. Generating new private key.
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: Your new public key is:
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHantvXTIoDqpGd7Ahu3Ccxc6FyyYYf63AA3X2sScq5h
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: time=2025-07-28T20:52:04.657-04:00 level=INFO source=routes.go:1235 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDIN>
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: time=2025-07-28T20:52:04.667-04:00 level=INFO source=images.go:476 msg="total blobs: 0"
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: time=2025-07-28T20:52:04.667-04:00 level=INFO source=images.go:483 msg="total unused blobs removed: 0"
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: time=2025-07-28T20:52:04.667-04:00 level=INFO source=routes.go:1288 msg="Listening on 127.0.0.1:11434 (version 0.9.6)"
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: time=2025-07-28T20:52:04.667-04:00 level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: time=2025-07-28T20:52:04.697-04:00 level=WARN source=amd_linux.go:61 msg="ollama recommends running the https://www.amd.com/en/support/>
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: time=2025-07-28T20:52:04.698-04:00 level=INFO source=amd_linux.go:296 msg="unsupported Radeon iGPU detected skipping" id=0 total="512.0>
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: time=2025-07-28T20:52:04.698-04:00 level=INFO source=amd_linux.go:402 msg="no compatible amdgpu devices detected"
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: time=2025-07-28T20:52:04.698-04:00 level=INFO source=gpu.go:377 msg="no compatible GPUs were discovered"
Jul 28 20:52:04 omnibook-ultra-14 ollama[103703]: time=2025-07-28T20:52:04.698-04:00 level=INFO source=types.go:130 msg="inference compute" id=0 library=cpu variant="" compute="" driver>

OS

No response

GPU

No response

CPU

No response

Ollama version

0.9.6

GiteaMirror added the bug label 2026-04-29 05:16:44 -05:00
Author
Owner

@rick-github commented on GitHub (Jul 29, 2025):

This looks normal. What's the output of systemctl status ollama --no-pager when the server exits after an hour?

Author
Owner

@sohochaser commented on GitHub (Jul 29, 2025):

> This looks normal. What's the output of systemctl status ollama --no-pager when the server exits after an hour?

Thank you, here is the output. Are there any other logs I can use to investigate the issue?

Author
Owner

@rick-github commented on GitHub (Jul 29, 2025):

> Thank you, here is the output.

Where?

> Are there any other logs I can use to investigate the issue?

journalctl -u ollama --no-pager

Author
Owner

@sohochaser commented on GitHub (Jul 29, 2025):

> > Thank you, here is the output.
>
> Where?
>
> > Are there any other logs I can use to investigate the issue?
>
> journalctl -u ollama --no-pager

something like this:
Jul 29 01:10:12 pm-78bf systemd[1]: Stopped Ollama Service.
Jul 29 01:10:13 pm-78bf systemd[1]: Started Ollama Service.
Jul 29 01:10:13 pm-78bf systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
Jul 29 01:10:13 pm-78bf systemd[1]: ollama.service: Failed with result 'exit-code'.
Jul 29 01:10:16 pm-78bf systemd[1]: ollama.service: Service RestartSec=3s expired, scheduling restart.
Jul 29 01:10:16 pm-78bf systemd[1]: ollama.service: Scheduled restart job, restart counter is at 3166.
Jul 29 01:10:16 pm-78bf systemd[1]: Stopped Ollama Service.
Jul 29 01:10:16 pm-78bf systemd[1]: Started Ollama Service.
Jul 29 01:10:16 pm-78bf systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
Jul 29 01:10:16 pm-78bf systemd[1]: ollama.service: Failed with result 'exit-code'.
Jul 29 01:10:19 pm-78bf systemd[1]: ollama.service: Service RestartSec=3s expired, scheduling restart.
Jul 29 01:10:19 pm-78bf systemd[1]: ollama.service: Scheduled restart job, restart counter is at 3167.
Jul 29 01:10:19 pm-78bf systemd[1]: Stopped Ollama Service.

Author
Owner

@rick-github commented on GitHub (Jul 29, 2025):

Output of

which ollama
ollama -v
Author
Owner

@sohochaser commented on GitHub (Jul 29, 2025):

> Output of
>
> which ollama
> ollama -v

/usr/local/bin/ollama

(base) [root@pm-78bf ~]# ollama -v
ollama version is 0.9.6

(base) [root@pm-78bf ~]# ls -ltr /var/lib|grep ollama
drwxr-xr-x 2 ollama ollama 6 Jul 28 22:35 ollama
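The listing above only covers /var/lib/ollama; the unit file also points OLLAMA_MODELS at /mnt/extdisk, which the ollama user must be able to traverse and write. A small probe along those lines (check_dir is a hypothetical helper, demonstrated here on a temporary directory; the real run would be executed as the service user, e.g. via sudo -u ollama, against the path from the unit file):

```shell
# Hypothetical helper: report whether a directory exists and is writable
# by the current user.
check_dir() {
  dir="$1"
  [ -d "$dir" ] || { echo "missing: $dir"; return 1; }
  [ -w "$dir" ] || { echo "not writable: $dir"; return 1; }
  echo "ok: $dir"
}

tmp=$(mktemp -d)    # stand-in for the OLLAMA_MODELS directory
check_dir "$tmp"    # prints: ok: <tmpdir>
```

If the ollama user cannot reach the external models directory (unmounted disk, restrictive mount options, ownership), that would be one plausible cause of the immediate status=1 exits seen in the journal.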


Reference: github-starred/ollama#54146