[GH-ISSUE #11255] llama runner process has terminated: exit status 2 when using ollama version 0.9.3 #69475

Closed
opened 2026-05-04 18:13:45 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @caiyongnjupt on GitHub (Jul 1, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11255

What is the issue?

error log

time=2025-07-01T20:41:24.587+08:00 level=INFO source=server.go:598 msg="waiting for llama runner to start responding"
time=2025-07-01T20:41:24.587+08:00 level=INFO source=server.go:632 msg="waiting for server to become available" status="llm server error"
time=2025-07-01T20:41:24.617+08:00 level=INFO source=runner.go:815 msg="starting go runner"
Exception 0xc0000005 0x0 0x0 0x7fffb8f22f58
PC=0x7fffb8f22f58
signal arrived during external code execution

runtime.cgocall(0x7ff6979bc470, 0xc0005075a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/cgocall.go:167 +0x3e fp=0xc000507578 sp=0xc000507510 pc=0x7ff696cd2dbe
github.com/ollama/ollama/ml/backend/ggml/ggml/src._Cfunc_ggml_backend_load_all_from_path(0x19605379220)
_cgo_gotypes.go:199 +0x45 fp=0xc0005075a0 sp=0xc000507578 pc=0x7ff697095c05
github.com/ollama/ollama/ml/backend/ggml/ggml/src.init.func1.1({0xc00002ed20, 0x28})
C:/a/ollama/ollama/ml/backend/ggml/ggml/src/ggml.go:97 +0xf5 fp=0xc000507638 sp=0xc0005075a0 pc=0x7ff697095635

[log01.txt](https://github.com/user-attachments/files/20998755/log01.txt)

Relevant log output


OS

win11

GPU

No response

CPU

No response

Ollama version

0.9.3

GiteaMirror added the bug label 2026-05-04 18:13:45 -05:00

@rick-github commented on GitHub (Jul 1, 2025):

Possibly #11209


@jmorganca commented on GitHub (Jul 2, 2025):

This should be fixed as in https://github.com/ollama/ollama/issues/11209#issuecomment-3025824850

Sorry for the issue! And thanks for the link @rick-github

Reference: github-starred/ollama#69475