[GH-ISSUE #7279] Ollama Docker image 0.4.0-rc3-rocm crashes due to missing shared library #4625

Closed
opened 2026-04-12 15:32:15 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @ic4-y on GitHub (Oct 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7279

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

I just tried out the latest 0.4.0-rc3-rocm Docker image, and `ollama_llama_server` crashes with

ollama-rocm | /usr/lib/ollama/runners/rocm/ollama_llama_server: error while loading shared libraries: libelf.so.1: cannot open shared object file: No such file or directory

I am running this on a Radeon Pro W6800; the latest stable release `0.3.13-rocm` works just fine.

Here is a slightly bigger section of the debug log in case that is helpful.

ollama-rocm | time=2024-10-20T17:07:58.034Z level=INFO source=llama-server.go:355 msg="starting llama server" cmd="/usr/lib/ollama/runners/rocm/ollama_llama_server --model /root/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa --ctx-size 8192 --batch-size 512 --embedding --n-gpu-layers 33 --verbose --threads 12 --parallel 4 --port 46723"
ollama-rocm | time=2024-10-20T17:07:58.034Z level=DEBUG source=llama-server.go:372 msg=subprocess environment="[PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin HSA_OVERRIDE_GFX_VERSION=10.3.0 ROCR_VISIBLE_DEVICES=0 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/runners/rocm HIP_VISIBLE_DEVICES=0]"
ollama-rocm | time=2024-10-20T17:07:58.037Z level=INFO source=sched.go:450 msg="loaded runners" count=1
ollama-rocm | time=2024-10-20T17:07:58.037Z level=INFO source=llama-server.go:534 msg="waiting for llama runner to start responding"
ollama-rocm | time=2024-10-20T17:07:58.037Z level=INFO source=llama-server.go:568 msg="waiting for server to become available" status="llm server error"
ollama-rocm | /usr/lib/ollama/runners/rocm/ollama_llama_server: error while loading shared libraries: libelf.so.1: cannot open shared object file: No such file or directory
ollama-rocm | time=2024-10-20T17:07:58.288Z level=ERROR source=sched.go:456 msg="error loading llama server" error="llama runner process has terminated: exit status 127"
ollama-rocm | time=2024-10-20T17:07:58.288Z level=DEBUG source=sched.go:459 msg="triggering expiration for failed load" model=/root/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa
ollama-rocm | time=2024-10-20T17:07:58.288Z level=DEBUG source=sched.go:361 msg="runner expired event received" modelPath=/root/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa
ollama-rocm | time=2024-10-20T17:07:58.288Z level=DEBUG source=sched.go:376 msg="got lock to unload" modelPath=/root/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa
ollama-rocm | [GIN] 2024/10/20 - 17:07:58 | 500 |  311.617978ms |       127.0.0.1 | POST     "/api/generate"
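
Exit status 127 in the log above is the shell's "command not found / cannot execute" code, which here points at the dynamic loader failing. As a quick diagnostic sketch (the container name `ollama-rocm` is an assumption taken from the compose log prefix), `ldd` lists which shared libraries a binary resolves:

```shell
# Inside the affected container, list any unresolved libraries for the
# runner binary (container name is an assumption from the logs):
#
#   docker exec -it ollama-rocm \
#     sh -c 'ldd /usr/lib/ollama/runners/rocm/ollama_llama_server | grep "not found"'
#
# The same technique works on any dynamically linked binary; a healthy
# binary such as /bin/ls should report zero unresolved libraries:
ldd /bin/ls
missing=$(ldd /bin/ls | grep -c "not found" || true)
echo "unresolved libraries for /bin/ls: $missing"
```

Any line ending in "not found" names a library the loader cannot locate, which is exactly the `libelf.so.1` failure reported above.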

OS

Linux

GPU

AMD

CPU

AMD

Ollama version

0.4.0-rc3-rocm (Docker image)

GiteaMirror added the docker, bug labels 2026-04-12 15:32:15 -05:00
Author
Owner

@robbiemu commented on GitHub (Oct 20, 2024):

Check whether libelf.so.1 is present on your Linux machine (it is usually located in /usr/lib or /usr/lib64). If it is not present, install the libutils package.
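
A minimal sketch of that check, assuming typical Linux library directories (the exact paths vary by distro, so adjust as needed):

```shell
# Look for libelf.so.1 in the usual library directories.
found=""
for dir in /usr/lib /usr/lib64 /usr/lib/x86_64-linux-gnu /lib; do
  if [ -e "$dir/libelf.so.1" ]; then
    found="$dir/libelf.so.1"
    break
  fi
done

if [ -n "$found" ]; then
  echo "libelf.so.1 found at $found"
else
  echo "libelf.so.1 not found in the checked directories"
fi
```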

Author
Owner

@ic4-y commented on GitHub (Oct 20, 2024):

I am running this in Docker. Maybe I am missing something, but why would it matter whether my host machine has that library? Shouldn't I look inside the container instead?

That being said, I am running NixOS on the host. Linked libraries might not be where you'd expect them to be. I am running docker containers here so that I have an independent environment from my host machine.

Edit: Turns out the hint was correct: running `apt install libelf-dev` INSIDE THE CONTAINER solves the issue.

That, however, is a bug in the image on Docker Hub.
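
Until a fixed image ships, a possible stopgap is a thin derived image so the fix survives container recreation. This is only a sketch: the image tag is taken from this report, and the runtime package name `libelf1` is an assumption for a Debian/Ubuntu base (`libelf-dev` also worked per the edit above):

```dockerfile
# Hypothetical workaround image: add the missing libelf runtime library
# on top of the affected release candidate.
FROM ollama/ollama:0.4.0-rc3-rocm
RUN apt-get update \
 && apt-get install -y --no-install-recommends libelf1 \
 && rm -rf /var/lib/apt/lists/*
```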

Author
Owner

@skobkin commented on GitHub (Oct 22, 2024):

Looks like my #7320 is probably a duplicate of this.

So, just waiting for a fixed image, then?

Reference: github-starred/ollama#4625