[GH-ISSUE #14874] GPU is not used, while using as system service (systemd) #35350

Closed
opened 2026-04-22 19:47:39 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @schoenid on GitHub (Mar 16, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14874

What is the issue?

While using Ollama, I found that it uses only the CPU instead of the GPU.

My system:
Operating System: Ubuntu Studio 24.04
KDE Plasma Version: 5.27.12
KDE Frameworks Version: 5.115.0
Qt Version: 5.15.13
Kernel Version: 6.17.0-19-generic (64-bit)
Graphics Platform: X11
Processors: 16 × 13th Gen Intel® Core™ i7-13700K
Memory: 62.5 GiB of RAM
Graphics Processor: NVIDIA GeForce RTX 4060 Ti/PCIe/SSE2
VRAM: 16GiB
Manufacturer: ASUS

Relevant log output

none.

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.18.0

GiteaMirror added the bug label 2026-04-22 19:47:39 -05:00

@rick-github commented on GitHub (Mar 16, 2026):

Server logs will aid in debugging.


@schoenid commented on GitHub (Mar 16, 2026):

I found that `/etc/systemd/system/ollama.service` contains:

```
[Unit]
After=network-online.target
```

This starts ollama before the GPU is ready, so it falls back to the CPU.

Using:

```
[Unit]
After=graphical.target
```

solves the problem.
But after an update the setting is reset to `After=network-online.target`.


@rick-github commented on GitHub (Mar 16, 2026):

Run `sudo systemctl edit ollama` and add the following after the line `### Anything between here and the comment below will become the new contents of the file`:

```
[Unit]
After=graphical.target
```
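As a side note, here is a non-interactive sketch of the same change, assuming the standard systemd drop-in layout (`systemctl edit` writes an `override.conf` drop-in rather than modifying the packaged unit file, which is why the change survives package updates):

```shell
# Create a drop-in for ollama.service by hand; this is what
# `sudo systemctl edit ollama` generates interactively.
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf > /dev/null <<'EOF'
[Unit]
After=graphical.target
EOF

# Reload unit definitions and restart so the new ordering takes effect.
sudo systemctl daemon-reload
sudo systemctl restart ollama

# The drop-in should now be listed under "Drop-In:" in the status output.
systemctl status ollama --no-pager
```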

@schoenid commented on GitHub (Mar 16, 2026):

Thanks. Works great!
It would be nice to have this in the install instructions.


@schoenid commented on GitHub (Mar 16, 2026):

> Server logs will aid in debugging.

```
cat: /home/user/.ollama/logs/server.log: No such file or directory
```

Ollama has been installed at: `/usr/local/lib/ollama/`
No idea where the logs are. (Not at `/usr/local/lib/ollama/logs` or `/var/log/`.)


@rick-github commented on GitHub (Mar 16, 2026):

> It would be nice to have this in the install instructions.

It's not in the install instructions because this is the first time it's been reported as an issue. The default install works for many users.

> No idea where the logs are. (Not at `/usr/local/lib/ollama/logs` or `/var/log/`.)

Follow the instructions after `On Linux systems with systemd, the logs can be found with this command:`.
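For anyone else looking for the logs: when Ollama runs as a systemd service, its output goes to the journal rather than `~/.ollama/logs`. A minimal sketch using standard `journalctl` flags (the unit name `ollama` matches the service discussed above):

```shell
# Show the most recent log lines for the ollama unit.
journalctl -u ollama --no-pager -n 100

# Or follow the log live while reproducing the GPU-detection problem:
journalctl -u ollama -f
```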


@schoenid commented on GitHub (Mar 16, 2026):

Thanks a lot!


@BillionClaw commented on GitHub (Mar 17, 2026):

Looking into this — checking systemd service configuration and GPU detection. Happy to submit a fix.

Reference: github-starred/ollama#35350