[GH-ISSUE #6165] No devices found using AMD gpus #3852

Closed
opened 2026-04-12 14:41:31 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @cinglish on GitHub (Aug 4, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6165

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

Getting the following error when loading models with AMD GPUs (Instinct MI60s):

rocBLAS error: Could not initialize Tensile host: No devices found

I have 4 devices allocated to the container, and Ollama still seems to discover them at startup:

time=2024-08-04T22:17:48.256Z level=INFO source=gpu.go:205 msg="looking for compatible GPUs"
time=2024-08-04T22:17:48.264Z level=INFO source=amd_linux.go:345 msg="amdgpu is supported" gpu=0 gpu_type=gfx906
time=2024-08-04T22:17:48.264Z level=WARN source=amd_linux.go:201 msg="amdgpu too old gfx000" gpu=1
time=2024-08-04T22:17:48.264Z level=WARN source=amd_linux.go:201 msg="amdgpu too old gfx000" gpu=2
time=2024-08-04T22:17:48.264Z level=WARN source=amd_linux.go:201 msg="amdgpu too old gfx000" gpu=3
time=2024-08-04T22:17:48.265Z level=WARN source=amd_linux.go:201 msg="amdgpu too old gfx000" gpu=4
time=2024-08-04T22:17:48.265Z level=INFO source=amd_linux.go:345 msg="amdgpu is supported" gpu=5 gpu_type=gfx906
time=2024-08-04T22:17:48.266Z level=INFO source=amd_linux.go:345 msg="amdgpu is supported" gpu=6 gpu_type=gfx906
time=2024-08-04T22:17:48.268Z level=INFO source=amd_linux.go:345 msg="amdgpu is supported" gpu=7 gpu_type=gfx906

It works fine with the 0.3.1-rocm image, but this behavior appears in the newest versions.

OS

Docker

GPU

AMD

CPU

Intel

Ollama version

0.3.2

GiteaMirror added the docker, linux, amd, bug labels 2026-04-12 14:41:31 -05:00
Author
Owner

@dhiltgen commented on GitHub (Aug 9, 2024):

Can you run the container with -e OLLAMA_DEBUG=1 and share the more complete log of startup and attempting to load a model?
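A minimal sketch of the requested debug run, assuming the standard flags for the rocm image from the Ollama Docker instructions (the volume name, container name, and port mapping are illustrative and can be adjusted to match the original setup):

```shell
# Run the rocm image with debug logging enabled.
# --device /dev/kfd and --device /dev/dri expose the AMD GPUs to the container.
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -e OLLAMA_DEBUG=1 \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm

# Capture the startup log, then attempt to load a model and capture that too.
docker logs -f ollama
```

With `OLLAMA_DEBUG=1` set, the server emits DEBUG-level log lines in addition to the INFO/WARN lines shown above, which is what the maintainer is asking to see.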

Author
Owner

@dhiltgen commented on GitHub (Sep 5, 2024):

If you're still having trouble, please make sure you pull the latest image, and if that doesn't resolve it, please share a more complete log as requested above and I'll reopen the issue.

Reference: github-starred/ollama#3852