[GH-ISSUE #6520] environmental variable not passed to service #29867

Closed
opened 2026-04-22 09:08:32 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @N4S4 on GitHub (Aug 26, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6520

What is the issue?

When trying to set an environment variable using `systemctl edit ollama.service`, followed by a daemon reload and a restart of `ollama.service`, the override is not passed to the service. I am trying to set `HSA_OVERRIDE_GFX_VERSION` to make the Radeon 780M iGPU of a Ryzen 9 7940HS work.
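For reference, the standard systemd drop-in workflow looks roughly like this (a sketch, not run on this machine; the unit name matches the service shown later in the thread, and the GFX version value is whatever your GPU requires):

```shell
# Create or edit a drop-in override for the service
sudo systemctl edit ollama.service
# In the editor, add:
#   [Service]
#   Environment="HSA_OVERRIDE_GFX_VERSION=11.0.0"

# Reload unit files and restart the service
sudo systemctl daemon-reload
sudo systemctl restart ollama.service

# Verify the merged unit and the effective environment settings
systemctl cat ollama.service
systemctl show ollama.service -p Environment
```

`systemctl show -p Environment` prints the `Environment=` assignments systemd has actually applied to the unit, which is a quick way to confirm a drop-in took effect.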

OS

Linux

GPU

AMD

CPU

AMD

Ollama version

0.3.6

GiteaMirror added the bug label 2026-04-22 09:08:33 -05:00
Author
Owner

@mxyng commented on GitHub (Aug 26, 2024):

Can you include the output of systemctl cat ollama.service?

Author
Owner

@N4S4 commented on GitHub (Aug 26, 2024):

Hi, thanks for the response:

```
# /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"

[Install]
WantedBy=default.target

# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="HSA_OVERRIDE_GFX_VERSION=11.0.0"
```
Author
Owner

@N4S4 commented on GitHub (Aug 26, 2024):

Not sure what happened, but now it seems the variable is passed. I changed it to 11.0.2 and then back to 11.0.0.

```
2024/08/26 21:05:04 routes.go:1125: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION:11.0.2 OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:
```

Now I have ROCm installed, but it is still not using my GPU.

Output of `journalctl -u ollama.service | grep gpu`:

```
WARN [server_params_parse] Not compiled with GPU offload support, --n-gpu-layers option will be ignored. See main README.md for information on enabling GPU BLAS support | n_gpu_layers=-1 tid="129784492963712" timestamp=1724699112
```

With debug logging enabled:

```
INFO source=amd_linux.go:274 msg="unsupported Radeon iGPU detected skipping" id=0 total="512.0 MiB"
```

Not sure why it is skipping my iGPU. Any idea?

Author
Owner

@N4S4 commented on GitHub (Aug 26, 2024):

Solved by increasing the VRAM allocation in the BIOS. Thank you.
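For anyone hitting the same "unsupported Radeon iGPU detected skipping" message: the log above reports only 512 MiB of dedicated VRAM, and on these iGPUs the allocation is fixed in BIOS/UEFI (often under an option named something like "UMA Frame Buffer Size"; the exact name varies by vendor). A rough way to check what the amdgpu driver currently sees (a sketch; the card index may differ on your system):

```shell
# Dedicated VRAM reported by the amdgpu driver, in bytes (card index may vary)
cat /sys/class/drm/card0/device/mem_info_vram_total
```

If this reports well under a gigabyte, raising the BIOS allocation, as done here, is the likely fix.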


Reference: github-starred/ollama#29867