[GH-ISSUE #5142] Segmentation fault on Ubuntu 24.04 LXC container #29003

Closed
opened 2026-04-22 07:35:30 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @MmDawN on GitHub (Jun 19, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5142

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

My runtime environment is based on an LXC container running Ubuntu 24.04 LTS.

After the installation of ollama v0.1.44, running ollama in bash returns a Segmentation fault error.

The journalctl -u ollama command reveals the following recurring error and indicates constant restarting:

ollama.service: Main process exited, code=killed, status=11/SEGV
ollama.service: Failed with result 'signal'.

See the attached image for reference:

![image](https://github.com/ollama/ollama/assets/40926229/73a15947-2c9a-4d16-8c75-011584d168b2)

I'm hoping someone can assist me in resolving this issue.

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.1.44

GiteaMirror added the bug label 2026-04-22 07:35:30 -05:00
Author
Owner

@dhiltgen commented on GitHub (Jun 19, 2024):

Can you try running it in the foreground with debug so we can see more logs on why it's failing?

sudo systemctl stop ollama
OLLAMA_DEBUG=1 ollama serve 2>&1 | tee server.log

If that doesn't crash immediately, try to load a model, and then share the server log if it has problems.

Author
Owner

@MmDawN commented on GitHub (Jun 20, 2024):

Can you try running it in the foreground with debug so we can see more logs on why it's failing?

sudo systemctl stop ollama
OLLAMA_DEBUG=1 ollama serve 2>&1 | tee server.log

If that doesn't crash immediately, try to load a model, and then share the server log if it has problems.

I followed the command you provided, but there is no log output in the server.log file.

![image](https://github.com/ollama/ollama/assets/40926229/a154b91a-719b-46cd-b321-77b0872000a7)

Author
Owner

@dhiltgen commented on GitHub (Jun 20, 2024):

Oh, the ollama binary itself immediately segfaults. Possible scenarios are that the binary got corrupted somehow, or that a system dependency library is missing. Can you run the following to help narrow this down?

file /usr/local/bin/ollama
ldd /usr/local/bin/ollama
sha256sum /usr/local/bin/ollama

We publish the checksums on the release page, and 0.1.44's Linux binary should be 748646f3fce6736025fd79fb0d4b81ff940d54410022dc28563b0db6a6d84fae
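As a sketch, that checksum comparison can be scripted so it prints a clear pass/fail instead of requiring an eyeball diff. The `verify_sha256` helper name is made up for illustration; the expected hash is the published value quoted above, and the path assumes the default install location.

```shell
# Compare a file's SHA-256 against a published value (hypothetical helper).
verify_sha256() {
  file="$1"; expected="$2"
  actual="$(sha256sum "$file" 2>/dev/null | awk '{print $1}')"
  if [ "$actual" = "$expected" ]; then
    echo "checksum OK"
  else
    echo "checksum MISMATCH: got ${actual:-<unreadable>}"
  fi
}

# Example against the published 0.1.44 Linux binary hash:
# verify_sha256 /usr/local/bin/ollama \
#   748646f3fce6736025fd79fb0d4b81ff940d54410022dc28563b0db6a6d84fae
```

A mismatch (or an unreadable file) would point at a corrupted download or a failing disk rather than a bug in the binary itself.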

Author
Owner

@MmDawN commented on GitHub (Jun 20, 2024):

Oh, the ollama binary itself immediately segfaults. Possible scenarios are the binary got corrupted somehow, or there's a system dependency library thats missing maybe. Can you run the following to help narrow this down?

file /usr/local/bin/ollama
ldd /usr/local/bin/ollama
sha256sum /usr/local/bin/ollama

We publish the checksums on the release page and 0.1.44's linux binary should be 748646f3fce6736025fd79fb0d4b81ff940d54410022dc28563b0db6a6d84fae

Oh, now there are some errors:
![image](https://github.com/ollama/ollama/assets/40926229/00dce257-6c0f-4860-bf50-9e97645b2aef)

Author
Owner

@dhiltgen commented on GitHub (Jun 20, 2024):

ldd shouldn't exit with an error, but you could try running it against other binaries on your system to compare the behavior. Also, the checksum isn't right if you did in fact install 0.1.44. Maybe your filesystem is corrupt or you have a failing drive? Check other system logs to see if there are other errors being reported.
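That comparison against a known-good binary can be sketched as follows. The `check_ldd` helper name is invented for illustration; the idea is simply that `ldd` on a healthy dynamically linked ELF lists its shared libraries and exits 0, while a corrupted or truncated binary typically makes it fail.

```shell
# Report whether ldd can resolve a binary's shared-library dependencies
# (hypothetical helper for comparing a suspect binary to a known-good one).
check_ldd() {
  if ldd "$1" >/dev/null 2>&1; then
    echo "ldd OK: $1"
  else
    echo "ldd FAILED: $1"
  fi
}

# e.g. compare a known-good binary with the suspect one:
# check_ldd /bin/ls
# check_ldd /usr/local/bin/ollama
```

If `ldd` fails for the ollama binary but succeeds for `/bin/ls`, the problem is with that one file, which fits the corrupted-download theory above.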

Author
Owner

@luojiyin1987 commented on GitHub (Jun 21, 2024):

What virtualisation solution are you using, @MmDawN?

Reference: github-starred/ollama#29003