[GH-ISSUE #13535] Can't override OLLAMA_LIBRARY_PATH [Ubuntu 24.04.3 LTS | ollama 0.13.5 systemd | AMD ROCm] #55428

Closed
opened 2026-04-29 09:10:12 -05:00 by GiteaMirror · 2 comments

Originally created by @osirisOfGit on GitHub (Dec 20, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13535

What is the issue?

What it says on the tin - I followed the instructions to install AMD ROCm natively on my system (https://rocm.docs.amd.com/projects/install-on-linux/en/latest/install/install-methods/package-manager/package-manager-ubuntu.html), and all lights are green on that front -

post-installation steps results
❯ rocm-smi


WARNING: AMD GPU device(s) is/are in a low-power state. Check power control/runtime_status

========================================= ROCm System Management Interface =========================================
=================================================== Concise Info ===================================================
Device  Node  IDs              Temp    Power   Partitions          SCLK   MCLK     Fan  Perf  PwrCap  VRAM%  GPU%  
              (DID,     GUID)  (Edge)  (Avg)   (Mem, Compute, ID)                                                  
====================================================================================================================
0       1     0x7550,   45211  37.0°C  15.0W   N/A, N/A, 0         16Mhz  456Mhz   0%   auto  330.0W  4%     2%    
1       2     0x13c0,   21135  37.0°C  0.015W  N/A, N/A, 0         N/A    2800Mhz  0%   auto  N/A     0%     0%    
====================================================================================================================
=============================================== End of ROCm SMI Log ================================================

~ 
❯ rocminfo | grep -i "Marketing Name:"
  Marketing Name:          AMD Ryzen 7 9800X3D 8-Core Processor
  Marketing Name:          AMD Radeon RX 9070 XT              
  Marketing Name:          AMD Radeon Graphics                

and the ollama user is added to the requisite groups:

❯ id ollama
uid=997(ollama) gid=984(ollama) groups=984(ollama),44(video),992(render)
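
For reference, a minimal sketch of how those memberships are typically granted (assuming the standard ROCm render and video groups), followed by a restart so the service picks them up:

❯ sudo usermod -aG render,video ollama
❯ sudo systemctl restart ollama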

but all my driver stuff is installed to /opt/rocm/

ls -lart /opt/rocm/
❯ lt /opt/rocm/
total 40
lrwxrwxrwx  1 root root    10 Nov 20 14:52 llvm -> ./lib/llvm
lrwxrwxrwx  1 root root    32 Nov 20 14:53 amdgcn -> lib/llvm/lib/clang/20/lib/amdgcn
drwxr-xr-x  8 root root  4096 Dec 17 20:24 .
drwxr-xr-x 25 root root  4096 Dec 17 20:24 share
drwxr-xr-x 13 root root  4096 Dec 17 20:24 libexec
drwxr-xr-x 49 root root  4096 Dec 17 20:24 include
drwxr-xr-x  5 root root  4096 Dec 17 20:24 ..
drwxr-xr-x  3 root root  4096 Dec 17 20:24 bin
drwxr-xr-x 15 root root 12288 Dec 17 20:26 lib
drwxr-xr-x  2 root root  4096 Dec 20 00:09 .info

not /usr/local/lib/ollama/rocm, which Ollama appears to default to, causing:

level=INFO source=runner.go:464 msg="failure during GPU discovery" OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/rocm]" extra_envs=map[] error="runner crashed"
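
One workaround I've seen suggested for this kind of location mismatch (an assumption on my part, not something the Ollama docs confirm) is to expose the system ROCm libraries to the service via the dynamic linker instead of OLLAMA_LIBRARY_PATH:

### /etc/systemd/system/ollama.service.d/override.conf
[Service]
# Assumes the Ubuntu ROCm packages place librocblas, libhipblas, etc. in /opt/rocm/lib
Environment="LD_LIBRARY_PATH=/opt/rocm/lib"

I can't say whether the discovery runner honors LD_LIBRARY_PATH, so treat this purely as a sketch.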

I've tried literally every way of overriding the OLLAMA_LIBRARY_PATH (system) environment variable that I can think of, but the result is always the same: discovery always uses the defaults and doesn't let me point it anywhere else.
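
For illustration, the most obvious attempt looks like this (assuming the variable takes a space-separated path list, which is how the log output renders it):

❯ sudo systemctl edit ollama

[Service]
Environment="OLLAMA_LIBRARY_PATH=/usr/local/lib/ollama /opt/rocm/lib"

followed by the usual reload and restart: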

❯ sudo systemctl daemon-reload

~ 
❯ sudo systemctl restart ollama

~ 
❯ sudo systemctl status ollama
● ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; preset: enabled)
    Drop-In: /etc/systemd/system/ollama.service.d
             └─override.conf
     Active: active (running) since Sat 2025-12-20 01:57:18 CST; 1s ago
   Main PID: 31823 (ollama)
      Tasks: 10 (limit: 73894)
     Memory: 10.4M (peak: 360.4M)
        CPU: 586ms
     CGroup: /system.slice/ollama.service
             └─31823 /usr/local/bin/ollama serve

Dec 20 01:57:18 Anubis ollama[31823]: "/usr/local/lib/ollama/rocm/rocblas/library/TensileLibrary_lazy_gfx1200.dat"
Dec 20 01:57:18 Anubis ollama[31823]: time=2025-12-20T01:57:18.826-06:00 level=INFO source=runner.go:464 msg="failure during GPU discovery" OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/rocm]" extra_envs=map[] error="runner crashed"
Dec 20 01:57:18 Anubis ollama[31823]: time=2025-12-20T01:57:18.826-06:00 level=TRACE source=runner.go:467 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/rocm]" devices=[]
Dec 20 01:57:18 Anubis ollama[31823]: time=2025-12-20T01:57:18.826-06:00 level=DEBUG source=runner.go:437 msg="bootstrap discovery took" duration=616.404417ms OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/rocm]" extra_envs=map[]
Dec 20 01:57:18 Anubis ollama[31823]: time=2025-12-20T01:57:18.826-06:00 level=INFO source=runner.go:106 msg="experimental Vulkan support disabled.  To enable, set OLLAMA_VULKAN=1"
Dec 20 01:57:18 Anubis ollama[31823]: time=2025-12-20T01:57:18.826-06:00 level=DEBUG source=runner.go:124 msg="evaluating which, if any, devices to filter out" initial_count=0
Dec 20 01:57:18 Anubis ollama[31823]: time=2025-12-20T01:57:18.826-06:00 level=TRACE source=runner.go:174 msg="supported GPU library combinations before filtering" supported=map[]
Dec 20 01:57:18 Anubis ollama[31823]: time=2025-12-20T01:57:18.826-06:00 level=DEBUG source=runner.go:40 msg="GPU bootstrap discovery took" duration=644.461991ms
Dec 20 01:57:18 Anubis ollama[31823]: time=2025-12-20T01:57:18.826-06:00 level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="60.4 GiB" available="55.5 GiB"
Dec 20 01:57:18 Anubis ollama[31823]: time=2025-12-20T01:57:18.826-06:00 level=INFO source=routes.go:1648 msg="entering low vram mode" "total vram"="0 B" threshold="20.0 GiB"
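
A generic way to check whether an override actually reaches the process (standard systemd and procfs, nothing Ollama-specific):

❯ systemctl show ollama --property=Environment
❯ sudo cat /proc/$(pidof ollama)/environ | tr '\0' '\n' | grep OLLAMA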

The image below is just proof that I've tried the most sensible way, and that the rest of the environment variables work - just not this one:

Ultrawide screenshot: https://github.com/user-attachments/assets/a029176e-c17a-446e-a6a9-d5017b790d5b

(Ignore the S[Service] in it; that was a fat finger as I was taking the screenshot. I was actually using [Service].)

Thankfully, Vulkan does appear to work:

Dec 20 02:11:34 Anubis ollama[33593]: time=2025-12-20T02:11:34.424-06:00 level=INFO source=types.go:42 msg="inference compute" id=00000000-7c00-0000-0000-000000000000 filter_id="" library=Vulkan compute=0.0 name=Vulkan1 description="AMD Radeon Graphics (RADV RAPHAEL_MENDOCINO)" libdirs=ollama,vulkan driver=0.0 pci_id=0000:7c:00.0 type=iGPU total="32.2 GiB" available="32.2 GiB"
Dec 20 02:11:34 Anubis ollama[33593]: time=2025-12-20T02:11:34.424-06:00 level=INFO source=types.go:42 msg="inference compute" id=00000000-0300-0000-0000-000000000000 filter_id="" library=Vulkan compute=0.0 name=Vulkan0 description="AMD Radeon RX 9070 XT (RADV GFX1201)" libdirs=ollama,vulkan driver=0.0 pci_id=0000:03:00.0 type=discrete total="15.9 GiB" available="15.2 GiB"
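
Per the "experimental Vulkan support disabled" log line earlier, enabling it for this run is just one more drop-in entry:

[Service]
Environment="OLLAMA_VULKAN=1"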

but I figured I'd report it anyway.

Let me know if there's any more info I can provide or test out. Thanks in advance!

Full journal output, reversed (new -> old), includes Vulkan:
ollama.log (https://github.com/user-attachments/files/24270557/ollama.log)

OS

Linux

GPU

AMD

CPU

AMD

Ollama version

0.13.5

GiteaMirror added the bug label 2026-04-29 09:10:12 -05:00

@rick-github commented on GitHub (Dec 20, 2025):

#12908


@osirisOfGit commented on GitHub (Dec 20, 2025):

Ah, it turns out it wasn't a problem with the library path - Ollama was just insisting on using my iGPU instead of my discrete graphics card. I finally got it to look only at my discrete Radeon with:

### Editing /etc/systemd/system/ollama.service.d/override.conf
### Anything between here and the comment below will become the contents of the drop-in file

[Service]
# Tell GGML to use ROCm (not CUDA) and set the visible device.
Environment="GGML_USE_ROCM=1"
Environment="GGML_USE_CUDA=0"


# Ollama‑specific helpers
Environment="OLLAMA_ROCM=1"
Environment="OLLAMA_DEBUG=0"
Environment="OLLAMA_VULKAN=0"
Environment="ROCR_VISIBLE_DEVICES=0"
Environment="ROCM_VISIBLE_DEVICES=0"

specifically:

Environment="ROCR_VISIBLE_DEVICES=0"
Environment="ROCM_VISIBLE_DEVICES=0"

(substituting the appropriate device ID - 0 in my case). It took both of these for it to finally work. Disregard this bug!
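
For picking the device index, the rocm-smi output at the top of this issue is enough (Device 0 is the RX 9070 XT). After editing the drop-in, reloading and grepping for the "inference compute" line confirms which device ends up selected:

❯ sudo systemctl daemon-reload && sudo systemctl restart ollama
❯ journalctl -u ollama -b | grep "inference compute"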
