[GH-ISSUE #6573] Getting Error: llama runner process has terminated: exit status 127 #66176

Closed
opened 2026-05-04 00:29:11 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @Blasserman on GitHub (Aug 30, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6573

What is the issue?

On two different aarch64 SBCs running Debian (one on Bookworm, one on Bullseye), loading the llama3.1 model into memory fails with the error below on Ollama 0.3.8. The previous version of Ollama worked fine.

dave@ai:~$ ollama -v
ollama version is 0.3.8
dave@ai:~$ ollama run llama3.1
Error: llama runner process has terminated: exit status 127
dave@ai:~$
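For context (not part of the original report): exit status 127 is the POSIX shell convention for "command not found", and when a spawned child process such as the llama runner dies with it, the usual cause is that the binary itself, or one of its shared libraries, could not be loaded. A minimal sketch of the convention, with a hypothetical diagnostic step whose runner path is an assumption and varies by install:

```shell
# Exit status 127 means the command (or its loader dependencies) could not
# be found. Reproducing the convention with a deliberately missing command:
/bin/sh -c 'no_such_command_xyz' 2>/dev/null
echo "exit status: $?"   # prints "exit status: 127"

# On an affected machine, listing the runner binary's dynamic dependencies
# can reveal a missing library. NOTE: this path is an assumption for
# illustration only; the actual location depends on the Ollama install.
# ldd /usr/local/lib/ollama/runners/cpu/ollama_llama_server | grep 'not found'
```

Any line flagged "not found" by `ldd` would point at the library the runner fails to load on these aarch64 boards.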

OS

Linux

GPU

Other

CPU

Other

Ollama version

0.3.8

GiteaMirror added the bug label 2026-05-04 00:29:11 -05:00
Author
Owner

@Blasserman commented on GitHub (Aug 30, 2024):

They are both aarch64 CPUs. One is Rockchip, one is Raspberry Pi.

Author
Owner

@pdevine commented on GitHub (Aug 30, 2024):

Dupe of #6541


Reference: github-starred/ollama#66176