[GH-ISSUE #12160] Issue running ollama on hybrid AMD/Nvidia GPU setup. #54598

Closed
opened 2026-04-29 06:30:49 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @Kraust on GitHub (Sep 2, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12160

What is the issue?

Sorry, I am new at this and trying to see whether local models are a good fit for my use case. I am trying to run a hybrid Nvidia + AMD setup on openSUSE Tumbleweed. I can get the Nvidia GPU running correctly, but when I also add the AMD iGPU (I assume this is faster than offloading to the CPU), there is a panic in a Go process.

Relevant log output

https://gist.github.com/Kraust/1034e5f1d9cd35206a75eb4a10be6358

For people who may want to search for this exact issue in the future:

Sep 02 16:13:18 exhaust ollama[2205776]: SIGSEGV: segmentation violation
Sep 02 16:13:18 exhaust ollama[2205776]: PC=0x7fdd01ef9b08 m=0 sigcode=1 addr=0x10
Sep 02 16:13:18 exhaust ollama[2205776]: signal arrived during cgo execution
Sep 02 16:13:18 exhaust ollama[2205776]: goroutine 1 gp=0xc000002380 m=0 mp=0x55f96798ad20 [syscall]:
Sep 02 16:13:18 exhaust ollama[2205776]: runtime.cgocall(0x55f96692d1c0, 0xc000131710)
Sep 02 16:13:18 exhaust ollama[2205776]:         runtime/cgocall.go:167 +0x4b fp=0xc0001316e8 sp=0xc0001316b0 pc=0x55f965c3d3eb
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/ollama/ollama/ml/backend/ggml/ggml/src._Cfunc_ggml_backend_load_all_from_path(0x55f98db12970)
Sep 02 16:13:18 exhaust ollama[2205776]:         _cgo_gotypes.go:195 +0x3e fp=0xc000131710 sp=0xc0001316e8 pc=0x55f965fea67e
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/ollama/ollama/ml/backend/ggml/ggml/src.init.func1.1({0xc000038064, 0x15})
Sep 02 16:13:18 exhaust ollama[2205776]:         github.com/ollama/ollama/ml/backend/ggml/ggml/src/ggml.go:97 +0xf5 fp=0xc0001317a8 sp=0xc000131710 pc=0x55f965fea115
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/ollama/ollama/ml/backend/ggml/ggml/src.init.func1()
Sep 02 16:13:18 exhaust ollama[2205776]:         github.com/ollama/ollama/ml/backend/ggml/ggml/src/ggml.go:98 +0x526 fp=0xc000131a38 sp=0xc0001317a8 pc=0x55f965fe9f66
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/ollama/ollama/ml/backend/ggml/ggml/src.init.OnceFunc.func2()
Sep 02 16:13:18 exhaust ollama[2205776]:         sync/oncefunc.go:27 +0x62 fp=0xc000131a80 sp=0xc000131a38 pc=0x55f965fe9962
Sep 02 16:13:18 exhaust ollama[2205776]: sync.(*Once).doSlow(0x0?, 0x0?)
Sep 02 16:13:18 exhaust ollama[2205776]:         sync/once.go:78 +0xab fp=0xc000131ad8 sp=0xc000131a80 pc=0x55f965c524ab
Sep 02 16:13:18 exhaust ollama[2205776]: sync.(*Once).Do(0x0?, 0x0?)
Sep 02 16:13:18 exhaust ollama[2205776]:         sync/once.go:69 +0x19 fp=0xc000131af8 sp=0xc000131ad8 pc=0x55f965c523d9
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/ollama/ollama/ml/backend/ggml/ggml/src.init.OnceFunc.func3()
Sep 02 16:13:18 exhaust ollama[2205776]:         sync/oncefunc.go:32 +0x2d fp=0xc000131b28 sp=0xc000131af8 pc=0x55f965fe98cd
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/ollama/ollama/llama.BackendInit()
Sep 02 16:13:18 exhaust ollama[2205776]:         github.com/ollama/ollama/llama/llama.go:61 +0x16 fp=0xc000131b38 sp=0xc000131b28 pc=0x55f965feec36
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/ollama/ollama/runner/llamarunner.Execute({0xc000034260, 0x4, 0x4})
Sep 02 16:13:18 exhaust ollama[2205776]:         github.com/ollama/ollama/runner/llamarunner/runner.go:866 +0x395 fp=0xc000131d08 sp=0xc000131b38 pc=0x55f9660ba935
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/ollama/ollama/runner.Execute({0xc000034250?, 0x0?, 0x0?})
Sep 02 16:13:18 exhaust ollama[2205776]:         github.com/ollama/ollama/runner/runner.go:22 +0xd4 fp=0xc000131d30 sp=0xc000131d08 pc=0x55f966145434
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/ollama/ollama/cmd.NewCLI.func2(0xc00004f500?, {0x55f966c00081?, 0x4?, 0x55f966c00085?})
Sep 02 16:13:18 exhaust ollama[2205776]:         github.com/ollama/ollama/cmd/cmd.go:1583 +0x45 fp=0xc000131d58 sp=0xc000131d30 pc=0x55f9668aa6e5
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/spf13/cobra.(*Command).execute(0xc0004fcf08, {0xc0004593c0, 0x4, 0x4})
Sep 02 16:13:18 exhaust ollama[2205776]:         github.com/spf13/cobra@v1.7.0/command.go:940 +0x85c fp=0xc000131e78 sp=0xc000131d58 pc=0x55f965db889c
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/spf13/cobra.(*Command).ExecuteC(0xc0004ce908)
Sep 02 16:13:18 exhaust ollama[2205776]:         github.com/spf13/cobra@v1.7.0/command.go:1068 +0x3a5 fp=0xc000131f30 sp=0xc000131e78 pc=0x55f965db90e5
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/spf13/cobra.(*Command).Execute(...)
Sep 02 16:13:18 exhaust ollama[2205776]:         github.com/spf13/cobra@v1.7.0/command.go:992
Sep 02 16:13:18 exhaust ollama[2205776]: github.com/spf13/cobra.(*Command).ExecuteContext(...)
Sep 02 16:13:18 exhaust ollama[2205776]:         github.com/spf13/cobra@v1.7.0/command.go:985
Sep 02 16:13:18 exhaust ollama[2205776]: main.main()
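
For context, the trace shows the crash inside the cgo call `ggml_backend_load_all_from_path`, which is guarded by `sync.OnceFunc` (the `init.OnceFunc.func2`/`func3` frames), so backend discovery runs exactly once at startup and the segfault occurs on that single invocation. A minimal sketch of that once-only initialization pattern, with a hypothetical `loadBackends` standing in for the cgo call:

```go
package main

import (
	"fmt"
	"sync"
)

// calls records how many times the (stand-in) backend loader ran.
var calls int

// loadBackends is a hypothetical stand-in for the cgo call
// ggml_backend_load_all_from_path seen in the stack trace.
func loadBackends() { calls++ }

// sync.OnceFunc (Go 1.21+) wraps loadBackends so it runs at most once,
// no matter how many goroutines or call sites invoke it.
var initBackends = sync.OnceFunc(loadBackends)

func main() {
	initBackends()
	initBackends() // second call is a no-op
	fmt.Println(calls)
}
```

Because the loader is wrapped this way, a crash inside it always surfaces at first use, which is why the panic appears immediately when the runner starts.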

Note my service:

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="OLLAMA_DEBUG=1"
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"

[Install]
WantedBy=default.target
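
(Not from the original thread: until a fixed build is available, one possible workaround is to hide the iGPU from the ROCm runtime via `ROCR_VISIBLE_DEVICES` in a systemd drop-in, so only the Nvidia GPU plus CPU are used. The device index `0` below is a placeholder; check `rocminfo` for the actual ordering on your system.)

```
# /etc/systemd/system/ollama.service.d/override.conf
# (create with: sudo systemctl edit ollama.service,
#  then: sudo systemctl daemon-reload && sudo systemctl restart ollama)
[Service]
Environment="ROCR_VISIBLE_DEVICES=0"
```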

OS

Linux

GPU

AMD

CPU

AMD

Ollama version

0.11.8

GiteaMirror added the bug label 2026-04-29 06:30:49 -05:00
Author
Owner

@Kraust commented on GitHub (Sep 2, 2025):

Note I am going through https://github.com/ollama/ollama/blob/main/docs/linux.md and manually reinstalling. I will try and go back to an older version and see if it resolves my issue.

Author
Owner

@Kraust commented on GitHub (Sep 2, 2025):

Resolved in https://github.com/ollama/ollama/releases/tag/v0.11.9-rc0 👍


Reference: github-starred/ollama#54598