[GH-ISSUE #2823] rocm crashes on Illegal seek for GPU arch : gfx1032 #1715

Closed
opened 2026-04-12 11:41:31 -05:00 by GiteaMirror · 4 comments

Originally created by @turlapati on GitHub (Feb 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2823

Originally assigned to: @dhiltgen on GitHub.

user@HTML:~$ ollama run gemma
Error: Post "http://127.0.0.1:11434/api/chat": EOF

...
[crash_0.1.27_gemma_rcom.txt](https://github.com/ollama/ollama/files/14441945/crash_0.1.27_gemma_rcom.txt)

loading library /tmp/ollama3347055972/rocm_v6/libext_server.so
time=2024-02-28T20:59:58.907-05:00 level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: /tmp/ollama3347055972/rocm_v6/libext_server.so"
time=2024-02-28T20:59:58.907-05:00 level=INFO source=dyn_ext_server.go:150 msg="Initializing llama server"

rocBLAS error: Cannot read /opt/rocm/lib/rocblas/library/TensileLibrary.dat: Illegal seek for GPU arch : gfx1032
free(): invalid pointer
SIGABRT: abort
PC=0x7fcc860969fc m=16 sigcode=18446744073709551610
signal arrived during cgo execution

GiteaMirror added the bug and amd labels 2026-04-12 11:41:31 -05:00

@turlapati commented on GitHub (Feb 29, 2024):

ROCm installed using instructions: https://rocm.docs.amd.com/projects/install-on-linux/en/latest/tutorial/quick-start.html#package-man-ubuntu
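
For reference, a quick way to double-check which gfx target the ROCm runtime actually detects (and whether rocBLAS ships a Tensile library for it) is the `rocminfo` tool included in that install; exact file names under the rocBLAS library directory vary between ROCm releases, so treat this as a sketch:

```shell
# List the GPU agents and gfx targets the ROCm runtime can see.
rocminfo | grep -i gfx

# rocBLAS loads per-architecture Tensile files from this directory; if nothing
# here mentions gfx1032, the "Cannot read ... TensileLibrary.dat" error above is expected.
ls /opt/rocm/lib/rocblas/library/ | grep -i gfx1032
```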


@pdevine commented on GitHub (Mar 1, 2024):

cc @dhiltgen


@dhiltgen commented on GitHub (Mar 1, 2024):

Unfortunately AMD has not released ROCm support for these GPUs. https://rocm.docs.amd.com/projects/install-on-linux/en/latest/reference/system-requirements.html

We're working on improvements to detect this unsupported-GPU scenario and gracefully fall back to CPU mode, with a log message explaining why and guidance on how you may be able to work around it by forcing ROCm to load a different version.

You may be able to get it working by setting `HSA_OVERRIDE_GFX_VERSION` to a supported version that's close; for example, `HSA_OVERRIDE_GFX_VERSION=10.3.0` may work for your GPU.
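
A minimal sketch of that workaround, assuming a standard Linux install where Ollama runs either manually via `ollama serve` or as a systemd service named `ollama` (adjust to your setup):

```shell
# Option 1: run the server manually with the override in its environment.
HSA_OVERRIDE_GFX_VERSION=10.3.0 ollama serve

# Option 2: if Ollama runs as a systemd service, add the override to the unit
# with `sudo systemctl edit ollama`, putting this in the drop-in file:
#   [Service]
#   Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
# then restart the service so it picks up the new environment.
sudo systemctl restart ollama
```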


@turlapati commented on GitHub (Mar 2, 2024):

Thanks for looking into the issue. Setting `HSA_OVERRIDE_GFX_VERSION=10.3.0` allows Ollama to run properly, but I start to see my screen flickering (same as the issue: https://www.reddit.com/r/LocalLLaMA/comments/18nfwy5/screen_flickering_in_linux_when_offloading_layers/?rdt=46343)...


Reference: github-starred/ollama#1715