[GH-ISSUE #12903] Unable to use my AMD GPU with ollama on Ubuntu #55065

Closed
opened 2026-04-29 08:16:11 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @malay2bfriend on GitHub (Nov 1, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12903

What is the issue?

I am new to running ollama on a local computer. I am on Ubuntu 22.04.5 LTS and trying to use my two AMD graphics cards.
Can someone help me get ollama to use them?

Relevant log output

Log 1:

journalctl -u ollama.service -n 20 --no-pager
Nov 01 05:47:55 ubuntu-dev-m systemd[1]: Stopping Ollama Service...
Nov 01 05:47:55 ubuntu-dev-m systemd[1]: ollama.service: Deactivated successfully.
Nov 01 05:47:55 ubuntu-dev-m systemd[1]: Stopped Ollama Service.
Nov 01 05:47:55 ubuntu-dev-m systemd[1]: ollama.service: Consumed 1.736s CPU time.
-- Boot bc6ec016c76f493a9ad5e1d74df3b7e2 --
Nov 01 06:21:09 ubuntu-dev-m systemd[1]: Started Ollama Service.
Nov 01 06:21:10 ubuntu-dev-m ollama[1729]: time=2025-11-01T06:21:10.046-04:00 level=INFO source=routes.go:1524 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:INFO OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES:1 http_proxy: https_proxy: no_proxy:]"
Nov 01 06:21:10 ubuntu-dev-m ollama[1729]: time=2025-11-01T06:21:10.048-04:00 level=INFO source=images.go:522 msg="total blobs: 0"
Nov 01 06:21:10 ubuntu-dev-m ollama[1729]: time=2025-11-01T06:21:10.048-04:00 level=INFO source=images.go:529 msg="total unused blobs removed: 0"
Nov 01 06:21:10 ubuntu-dev-m ollama[1729]: time=2025-11-01T06:21:10.049-04:00 level=INFO source=routes.go:1577 msg="Listening on 127.0.0.1:11434 (version 0.12.8)"
Nov 01 06:21:10 ubuntu-dev-m ollama[1729]: time=2025-11-01T06:21:10.051-04:00 level=INFO source=runner.go:76 msg="discovering available GPUs..."
Nov 01 06:21:10 ubuntu-dev-m ollama[1729]: time=2025-11-01T06:21:10.055-04:00 level=INFO source=server.go:400 msg="starting runner" cmd="/usr/local/bin/ollama runner --ollama-engine --port 43111"
Nov 01 06:21:10 ubuntu-dev-m ollama[1729]: time=2025-11-01T06:21:10.109-04:00 level=INFO source=server.go:400 msg="starting runner" cmd="/usr/local/bin/ollama runner --ollama-engine --port 35973"
Nov 01 06:21:10 ubuntu-dev-m ollama[1729]: time=2025-11-01T06:21:10.148-04:00 level=INFO source=server.go:400 msg="starting runner" cmd="/usr/local/bin/ollama runner --ollama-engine --port 37421"
Nov 01 06:21:12 ubuntu-dev-m ollama[1729]: time=2025-11-01T06:21:12.450-04:00 level=INFO source=runner.go:498 msg="failure during GPU discovery" OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/rocm]" extra_envs=map[] error="runner crashed"
Nov 01 06:21:12 ubuntu-dev-m ollama[1729]: time=2025-11-01T06:21:12.450-04:00 level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="7.7 GiB" available="6.1 GiB"
Nov 01 06:21:12 ubuntu-dev-m ollama[1729]: time=2025-11-01T06:21:12.450-04:00 level=INFO source=routes.go:1618 msg="entering low vram mode" "total vram"="0 B" threshold="20.0 GiB"
Nov 01 15:21:30 ubuntu-dev-m systemd[1]: Stopping Ollama Service...
Nov 01 15:21:30 ubuntu-dev-m systemd[1]: ollama.service: Deactivated successfully.
Nov 01 15:21:30 ubuntu-dev-m systemd[1]: Stopped Ollama Service.
Nov 01 15:21:30 ubuntu-dev-m systemd[1]: ollama.service: Consumed 2.624s CPU time.


Log 2:
OLLAMA_DEBUG=1 ollama serve
time=2025-11-01T15:21:31.703-04:00 level=INFO source=routes.go:1524 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:DEBUG OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/home/malay2bfriend/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2025-11-01T15:21:31.704-04:00 level=INFO source=images.go:522 msg="total blobs: 0"
time=2025-11-01T15:21:31.704-04:00 level=INFO source=images.go:529 msg="total unused blobs removed: 0"
time=2025-11-01T15:21:31.704-04:00 level=INFO source=routes.go:1577 msg="Listening on 127.0.0.1:11434 (version 0.12.8)"
time=2025-11-01T15:21:31.704-04:00 level=DEBUG source=sched.go:120 msg="starting llm scheduler"
time=2025-11-01T15:21:31.704-04:00 level=INFO source=runner.go:76 msg="discovering available GPUs..."
time=2025-11-01T15:21:31.705-04:00 level=INFO source=server.go:400 msg="starting runner" cmd="/usr/local/bin/ollama runner --ollama-engine --port 46331"
time=2025-11-01T15:21:31.705-04:00 level=DEBUG source=server.go:401 msg=subprocess OLLAMA_DEBUG=1 PATH=/home/malay2bfriend/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/bin:/home/malay2bfriend/.dotnet/tools LD_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/cuda_v13 OLLAMA_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/cuda_v13
time=2025-11-01T15:21:31.738-04:00 level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=33.277122ms OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/cuda_v13]" extra_envs=map[]
time=2025-11-01T15:21:31.738-04:00 level=INFO source=server.go:400 msg="starting runner" cmd="/usr/local/bin/ollama runner --ollama-engine --port 33055"
time=2025-11-01T15:21:31.738-04:00 level=DEBUG source=server.go:401 msg=subprocess OLLAMA_DEBUG=1 PATH=/home/malay2bfriend/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/bin:/home/malay2bfriend/.dotnet/tools LD_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/rocm
time=2025-11-01T15:21:33.327-04:00 level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=1.589538316s OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/rocm]" extra_envs=map[]
time=2025-11-01T15:21:33.328-04:00 level=INFO source=server.go:400 msg="starting runner" cmd="/usr/local/bin/ollama runner --ollama-engine --port 36055"
time=2025-11-01T15:21:33.328-04:00 level=DEBUG source=server.go:401 msg=subprocess OLLAMA_DEBUG=1 PATH=/home/malay2bfriend/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/bin:/home/malay2bfriend/.dotnet/tools LD_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/cuda_v12 OLLAMA_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/cuda_v12
time=2025-11-01T15:21:33.360-04:00 level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=33.026582ms OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/cuda_v12]" extra_envs=map[]
time=2025-11-01T15:21:33.360-04:00 level=DEBUG source=runner.go:120 msg="evluating which if any devices to filter out" initial_count=2
time=2025-11-01T15:21:33.360-04:00 level=DEBUG source=runner.go:132 msg="verifying GPU is supported" library=/usr/local/lib/ollama/rocm description="AMD Radeon RX 6600 XT" compute=gfx1032 id=0 pci_id=0000:03:00.0
time=2025-11-01T15:21:33.361-04:00 level=DEBUG source=runner.go:132 msg="verifying GPU is supported" library=/usr/local/lib/ollama/rocm description="AMD Radeon RX 6700 XT" compute=gfx1031 id=1 pci_id=0000:06:00.0
time=2025-11-01T15:21:33.361-04:00 level=INFO source=server.go:400 msg="starting runner" cmd="/usr/local/bin/ollama runner --ollama-engine --port 40599"
time=2025-11-01T15:21:33.361-04:00 level=DEBUG source=server.go:401 msg=subprocess OLLAMA_DEBUG=1 PATH=/home/malay2bfriend/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/bin:/home/malay2bfriend/.dotnet/tools LD_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/rocm ROCR_VISIBLE_DEVICES=0 GGML_CUDA_INIT=1
time=2025-11-01T15:21:33.361-04:00 level=INFO source=server.go:400 msg="starting runner" cmd="/usr/local/bin/ollama runner --ollama-engine --port 32795"
time=2025-11-01T15:21:33.361-04:00 level=DEBUG source=server.go:401 msg=subprocess OLLAMA_DEBUG=1 PATH=/home/malay2bfriend/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/bin:/home/malay2bfriend/.dotnet/tools LD_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/local/lib/ollama:/usr/local/lib/ollama/rocm GGML_CUDA_INIT=1 ROCR_VISIBLE_DEVICES=1
time=2025-11-01T15:21:35.183-04:00 level=INFO source=runner.go:498 msg="failure during GPU discovery" OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/rocm]" extra_envs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:0]" error="runner crashed"
time=2025-11-01T15:21:35.183-04:00 level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=1.822625782s OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/rocm]" extra_envs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:0]"
time=2025-11-01T15:21:35.183-04:00 level=DEBUG source=runner.go:158 msg="filtering device which didn't fully initialize" id=0 libdir=/usr/local/lib/ollama/rocm pci_id=0000:03:00.0 library=ROCm
time=2025-11-01T15:21:35.201-04:00 level=INFO source=runner.go:498 msg="failure during GPU discovery" OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/rocm]" extra_envs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:1]" error="runner crashed"
time=2025-11-01T15:21:35.202-04:00 level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=1.840962743s OLLAMA_LIBRARY_PATH="[/usr/local/lib/ollama /usr/local/lib/ollama/rocm]" extra_envs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:1]"
time=2025-11-01T15:21:35.202-04:00 level=DEBUG source=runner.go:158 msg="filtering device which didn't fully initialize" id=1 libdir=/usr/local/lib/ollama/rocm pci_id=0000:06:00.0 library=ROCm
time=2025-11-01T15:21:35.202-04:00 level=DEBUG source=runner.go:41 msg="GPU bootstrap discovery took" duration=3.497477921s
time=2025-11-01T15:21:35.202-04:00 level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="7.7 GiB" available="4.8 GiB"
time=2025-11-01T15:21:35.202-04:00 level=INFO source=routes.go:1618 msg="entering low vram mode" "total vram"="0 B" threshold="20.0 GiB"


**** Here you can see the failure during GPU discovery.

Let me know if you need any other log from my computer.

OS

Linux

GPU

AMD

CPU

Intel

Ollama version

0.12.8

GiteaMirror added the gpu, amd, bug, needs more info labels 2026-04-29 08:16:12 -05:00

@rick-github commented on GitHub (Nov 1, 2025):

Set `HSA_OVERRIDE_GFX_VERSION=10.3.0` in the [server config](https://github.com/ollama/ollama/blob/main/docs/faq.mdx#how-do-i-configure-ollama-server).
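For the systemd install shown in Log 1, one common way to apply this (per the linked FAQ) is a drop-in override: run `sudo systemctl edit ollama.service`, add the fragment below, then `sudo systemctl restart ollama`. This is a sketch, not the only method; the override value `10.3.0` maps both RDNA2 cards from the log (RX 6600 XT / gfx1032 and RX 6700 XT / gfx1031) onto gfx1030, the target ROCm ships prebuilt kernels for.

```ini
# /etc/systemd/system/ollama.service.d/override.conf
# Make ROCm treat gfx1031/gfx1032 as the supported gfx1030 target.
[Service]
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
```

After restarting, the server log should report the GPUs under "inference compute" instead of falling back to cpu.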


@pdevine commented on GitHub (Nov 1, 2025):

@malay2bfriend can you update the issue if @rick-github 's suggestion works?


@ganakee commented on GitHub (Nov 12, 2025):

For me, AMD on Linux (25.10 in my case, but also on 22.04) is broken and does not use any AMD GPU. This has been broken since 0.12.6.


@rick-github commented on GitHub (Nov 13, 2025):

@ganakee The [server log](https://docs.ollama.com/troubleshooting) will help in debugging.


@malay2bfriend commented on GitHub (Nov 13, 2025):

Thank you @rick-github, using your recommended setting I can run ollama on my AMD GPUs.


Reference: github-starred/ollama#55065