[GH-ISSUE #11123] Ollama crashes with Memory critical error: Reason: Memory in use. #7335

Open
opened 2026-04-12 19:23:05 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @brpaz on GitHub (Jun 18, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11123

What is the issue?

Hello.

I finally managed to get Ollama running on my AMD 780M GPU, but now it is completely broken.
Every time I run a prompt, I get the following error:

Error: model runner has unexpectedly stopped, this may be due to resource limitations or an internal error, check ollama server logs for details

Checking the server logs, it seems to be a memory-related issue:

Jun 18 22:16:41 ollama[116408]: ROCm error: invalid device function
Jun 18 22:16:41 ollama[116408]:   current device: 0, in function ggml_cuda_compute_forward at //ml/backend/ggml/ggml/src/ggml-cuda/ggml-cuda.cu:2362
Jun 18 22:16:41 ollama[116408]:   err
Jun 18 22:16:41 ollama[116408]: //ml/backend/ggml/ggml/src/ggml-cuda/ggml-cuda.cu:75: ROCm error
Jun 18 22:16:41 ollama[116408]: Memory critical error by agent node-0 (Agent handle: 0x561adbeb8a80) on address 0x7f401b600000. Reason: Memory in use.
Jun 18 22:16:41 ollama[116408]: SIGABRT: abort
Jun 18 22:16:41 ollama[116408]: PC=0x7f407936711c m=7 sigcode=18446744073709551610
Jun 18 22:16:41 ollama[116408]: signal arrived during cgo execution

I suspect the memory calculation is not correct:

jun 18 22:37:35 bruno-laptop ollama[146230]: llama_model_load_from_file_impl: using device ROCm0 (AMD Radeon Graphics) - 23860 MiB free

I have only allocated 16 GB to VRAM.

Also, where does this 3.7 GiB come from?

jun 18 22:20:58 bruno-laptop ollama[126301]: time=2025-06-18T22:20:58.717+01:00 level=DEBUG source=sched.go:361 msg="after processing request finished event" runner.name=registry.ollama.ai/library/llama3.2:latest runner.inference=rocm runner.devices=1 runner.size="3.7 GiB" runner.vram="3.7 GiB" runner.parallel=2 runner.pid=126516 runner.model=/usr/share/ollama/.ollama/models/blobs/sha256-dde5aa3fc5ffc17176b5e8bdc82f587b24b2678c6c66101bf7da77af9f7ccdff runner.num_ctx=8192 refCount=0
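
For what it's worth, here is a back-of-the-envelope sketch (my own assumptions, not taken from Ollama's scheduler code) of where a figure in that range could come from for llama3.2 at num_ctx=8192: quantized weights, plus an fp16 KV cache, plus compute buffers.

```python
# Rough, hypothetical breakdown of runner.size="3.7 GiB" for llama3.2:latest
# at num_ctx=8192. The architecture numbers below are my assumptions about
# the 3B model (28 layers, 8 KV heads, head_dim 128), not read from the logs.
GiB = 1024 ** 3

n_layers, n_kv_heads, head_dim = 28, 8, 128
n_ctx = 8192
kv_bytes_per_token = 2 * n_kv_heads * head_dim * 2  # K and V, fp16 (2 bytes each)
kv_cache = n_layers * kv_bytes_per_token * n_ctx    # bytes for the full context

weights = 2.0 * GiB  # ballpark for the ~2 GB Q4 weight blob
buffers = 0.8 * GiB  # guess at graph/scratch compute buffers

total = (weights + kv_cache + buffers) / GiB
print(f"KV cache ~= {kv_cache / GiB:.2f} GiB, total ~= {total:.1f} GiB")
```

That lands in the right ballpark for the reported 3.7 GiB, but the scheduler's actual estimate is computed differently, so treat this only as a sanity check.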

Relevant log output

source=amd_linux.go:389 msg="skipping rocm gfx compatibility check" HSA_OVERRIDE_GFX_VERSION=11.0.3
jun 18 22:36:27 bruno-laptop ollama[146230]: time=2025-06-18T22:36:27.925+01:00 level=INFO source=types.go:130 msg="inference compute" id=0 library=rocm variant="" compute=gfx1103 driver=0.0 name=1002:1900 total="16.0 GiB" available="14.1 GiB"

Jun 18 22:16:40 ollama[116408]: time=2025-06-18T22:16:40.659+01:00 level=INFO source=server.go:630 msg="llama runner started in 2.26 seconds"
Jun 18 22:16:40 ollama[116408]: time=2025-06-18T22:16:40.659+01:00 level=DEBUG source=sched.go:495 msg="finished setting up" runner.name=registry.ollama.ai/library/llama3.2:latest runner.inference=rocm runner.devices=1 runner.size="3.7 GiB" runner.vram="3.7 GiB" runner.parallel=2 runner.pid=121180 runner.model=/usr/share/ollama/.ollama/models/blobs/sha256-dde5aa3fc5ffc17176b5e8bdc82f587b24b2678c6c66101bf7da77af9f7ccdff runner.num_ctx=8192
Jun 18 22:16:40 ollama[116408]: [GIN] 2025/06/18 - 22:16:40 | 200 |  2.747136819s |       127.0.0.1 | POST     "/api/generate"
Jun 18 22:16:40 ollama[116408]: time=2025-06-18T22:16:40.659+01:00 level=DEBUG source=sched.go:503 msg="context for request finished"
Jun 18 22:16:40 ollama[116408]: time=2025-06-18T22:16:40.659+01:00 level=DEBUG source=sched.go:343 msg="runner with non-zero duration has gone idle, adding timer" runner.name=registry.ollama.ai/library/llama3.2:latest runner.inference=rocm runner.devices=1 runner.size="3.7 GiB" runner.vram="3.7 GiB" runner.parallel=2 runner.pid=121180 runner.model=/usr/share/ollama/.ollama/models/blobs/sha256-dde5aa3fc5ffc17176b5e8bdc82f587b24b2678c6c66101bf7da77af9f7ccdff runner.num_ctx=8192 duration=5m0s
Jun 18 22:16:40 ollama[116408]: time=2025-06-18T22:16:40.659+01:00 level=DEBUG source=sched.go:361 msg="after processing request finished event" runner.name=registry.ollama.ai/library/llama3.2:latest runner.inference=rocm runner.devices=1 runner.size="3.7 GiB" runner.vram="3.7 GiB" runner.parallel=2 runner.pid=121180 runner.model=/usr/share/ollama/.ollama/models/blobs/sha256-dde5aa3fc5ffc17176b5e8bdc82f587b24b2678c6c66101bf7da77af9f7ccdff runner.num_ctx=8192 refCount=0
Jun 18 22:16:41 ollama[116408]: time=2025-06-18T22:16:41.972+01:00 level=DEBUG source=ggml.go:155 msg="key not found" key=general.alignment default=32
Jun 18 22:16:41 ollama[116408]: time=2025-06-18T22:16:41.972+01:00 level=DEBUG source=sched.go:615 msg="evaluating already loaded" model=/usr/share/ollama/.ollama/models/blobs/sha256-dde5aa3fc5ffc17176b5e8bdc82f587b24b2678c6c66101bf7da77af9f7ccdff
Jun 18 22:16:41 ollama[116408]: time=2025-06-18T22:16:41.973+01:00 level=DEBUG source=server.go:729 msg="completion request" images=0 prompt=207 format=""
Jun 18 22:16:41 ollama[116408]: time=2025-06-18T22:16:41.973+01:00 level=DEBUG source=cache.go:104 msg="loading cache slot" id=0 cache=0 prompt=28 used=0 remaining=28
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :371 : 3934306342d us:   hipStreamSynchronize ( stream:0x7f3e650cf6a0 ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :372 : 3934306351d us:  hipStreamSynchronize: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :371 : 3934306353d us:   hipStreamSynchronize ( stream:0x7f3e650cf6a0 ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :372 : 3934306354d us:  hipStreamSynchronize: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :634 : 3934306356d us:   hipGetDevice ( 0x7f402bffe95c ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :642 : 3934306357d us:  hipGetDevice: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_memory.cpp           :1572: 3934306362d us:   hipMemcpyAsync ( 0x7f3d54c00000, 0x7f3e0d001800, 344064, hipMemcpyHostToDevice, stream:0x2 ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_memory.cpp           :1573: 3934306402d us:  hipMemcpyAsync: Returned hipSuccess : : duration: 40d us
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :371 : 3934306404d us:   hipStreamSynchronize ( stream:0x2 ) 
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.hpp           :67  : 3934306406d us:  Host active wait for Signal = (0x7f40225ff700) for 10000 ns
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.cpp           :484 : 3934306438d us:  Set Handler: handle(0x7f40225ff680), timestamp(0x7f3e64646280)
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.hpp           :67  : 3934306441d us:  Host active wait for Signal = (0x7f40225ff680) for -1 ns
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :372 : 3934306487d us:  hipStreamSynchronize: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :371 : 3934306489d us:   hipStreamSynchronize ( stream:0x7f3e650cf6a0 ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :372 : 3934306490d us:  hipStreamSynchronize: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :634 : 3934306492d us:   hipGetDevice ( 0x7f402bffe95c ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :642 : 3934306494d us:  hipGetDevice: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_memory.cpp           :1572: 3934306495d us:   hipMemcpyAsync ( 0x7f3d55200000, 0x7f3e0c000800, 112, hipMemcpyHostToDevice, stream:0x2 ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_memory.cpp           :1573: 3934306501d us:  hipMemcpyAsync: Returned hipSuccess : : duration: 6d us
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :371 : 3934306504d us:   hipStreamSynchronize ( stream:0x2 ) 
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.hpp           :67  : 3934306506d us:  Host active wait for Signal = (0x7f40225ff600) for 10000 ns
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.cpp           :227 : 3934306500d us:  Handler: value(0), timestamp(0x7f401020bd60), handle(0x7f40225ff680)
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.cpp           :484 : 3934306516d us:  Set Handler: handle(0x7f40225ff580), timestamp(0x7f3e645060f0)
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.hpp           :67  : 3934306518d us:  Host active wait for Signal = (0x7f40225ff580) for -1 ns
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :372 : 3934306528d us:  hipStreamSynchronize: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :371 : 3934306529d us:   hipStreamSynchronize ( stream:0x7f3e650cf6a0 ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :372 : 3934306531d us:  hipStreamSynchronize: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :634 : 3934306532d us:   hipGetDevice ( 0x7f402bffe95c ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :642 : 3934306533d us:  hipGetDevice: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_memory.cpp           :1572: 3934306535d us:   hipMemcpyAsync ( 0x7f3d55200800, 0x7f3e0c001000, 8192, hipMemcpyHostToDevice, stream:0x2 ) 
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.cpp           :227 : 3934306538d us:  Handler: value(0), timestamp(0x7f3e64f893f0), handle(0x7f40225ff580)
Jun 18 22:16:41 ollama[116408]: :3:hip_memory.cpp           :1573: 3934306539d us:  hipMemcpyAsync: Returned hipSuccess : : duration: 4d us
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :371 : 3934306542d us:   hipStreamSynchronize ( stream:0x2 ) 
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.hpp           :67  : 3934306544d us:  Host active wait for Signal = (0x7f40225ff500) for 10000 ns
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.cpp           :484 : 3934306554d us:  Set Handler: handle(0x7f40225ff480), timestamp(0x7f3e649259f0)
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.hpp           :67  : 3934306556d us:  Host active wait for Signal = (0x7f40225ff480) for -1 ns
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :372 : 3934306566d us:  hipStreamSynchronize: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :371 : 3934306568d us:   hipStreamSynchronize ( stream:0x7f3e650cf6a0 ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :372 : 3934306569d us:  hipStreamSynchronize: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :634 : 3934306571d us:   hipGetDevice ( 0x7f402bffe95c ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :642 : 3934306572d us:  hipGetDevice: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_memory.cpp           :1572: 3934306573d us:   hipMemcpyAsync ( 0x7f3d56200800, 0x7f3e0d001000, 4, hipMemcpyHostToDevice, stream:0x2 ) 
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.cpp           :227 : 3934306576d us:  Handler: value(0), timestamp(0x7f3e650bc940), handle(0x7f40225ff480)
Jun 18 22:16:41 ollama[116408]: :3:hip_memory.cpp           :1573: 3934306577d us:  hipMemcpyAsync: Returned hipSuccess : : duration: 4d us
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :371 : 3934306587d us:   hipStreamSynchronize ( stream:0x2 ) 
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.hpp           :67  : 3934306589d us:  Host active wait for Signal = (0x7f40225ff400) for 10000 ns
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.cpp           :484 : 3934306592d us:  Set Handler: handle(0x7f40225ff380), timestamp(0x7f3e64fa2610)
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.hpp           :67  : 3934306594d us:  Host active wait for Signal = (0x7f40225ff380) for -1 ns
Jun 18 22:16:41 ollama[116408]: :3:hip_stream.cpp           :372 : 3934306604d us:  hipStreamSynchronize: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :634 : 3934306606d us:   hipGetDevice ( 0x7f402bffe3dc ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :642 : 3934306607d us:  hipGetDevice: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_platform.cpp         :234 : 3934306612d us:   __hipPushCallConfiguration ( {28,1,1}, {1024,1,1}, 0, stream:0x7f3e650cf6a0 ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_platform.cpp         :238 : 3934306617d us:  __hipPushCallConfiguration: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:rocvirtual.cpp           :227 : 3934306613d us:  Handler: value(0), timestamp(0x7f3e6470a150), handle(0x7f40225ff380)
Jun 18 22:16:41 ollama[116408]: :3:hip_platform.cpp         :243 : 3934306619d us:   __hipPopCallConfiguration ( {10,0,1701999731}, {962946662,25397,738190080}, 0x7f402bffe300, 0x7f402bffe2f8 ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_platform.cpp         :252 : 3934306621d us:  __hipPopCallConfiguration: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: :3:hip_module.cpp           :685 : 3934306625d us:   hipLaunchKernel ( 0x7f3fe018ae28, {28,1,1}, {1024,1,1}, 0x7f402bffe370, 0, stream:0x7f3e650cf6a0 ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_module.cpp           :686 : 3934306628d us:  hipLaunchKernel: Returned hipErrorInvalidDeviceFunction : : duration: 3d us
Jun 18 22:16:41 ollama[116408]: :3:hip_error.cpp            :36  : 3934306630d us:   hipGetLastError (  ) 
Jun 18 22:16:41 ollama[116408]: ggml_cuda_compute_forward: RMS_NORM failed
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :634 : 3934306633d us:   hipGetDevice ( 0x7f402bffe3bc ) 
Jun 18 22:16:41 ollama[116408]: :3:hip_device_runtime.cpp   :642 : 3934306634d us:  hipGetDevice: Returned hipSuccess :
Jun 18 22:16:41 ollama[116408]: ROCm error: invalid device function
Jun 18 22:16:41 ollama[116408]:   current device: 0, in function ggml_cuda_compute_forward at //ml/backend/ggml/ggml/src/ggml-cuda/ggml-cuda.cu:2362
Jun 18 22:16:41 ollama[116408]:   err
Jun 18 22:16:41 ollama[116408]: //ml/backend/ggml/ggml/src/ggml-cuda/ggml-cuda.cu:75: ROCm error
Jun 18 22:16:41 ollama[116408]: Memory critical error by agent node-0 (Agent handle: 0x561adbeb8a80) on address 0x7f401b600000. Reason: Memory in use.
Jun 18 22:16:41 ollama[116408]: SIGABRT: abort
Jun 18 22:16:41 ollama[116408]: PC=0x7f407936711c m=7 sigcode=18446744073709551610
Jun 18 22:16:41 ollama[116408]: signal arrived during cgo execution
Jun 18 22:16:41 ollama[116408]: goroutine 11 gp=0xc000505c00 m=7 mp=0xc00050e008 [syscall]:
Jun 18 22:16:41 ollama[116408]: runtime.cgocall(0x561ada4c8a10, 0xc000093bd8)
Jun 18 22:16:41 ollama[116408]: 	runtime/cgocall.go:167 +0x4b fp=0xc000093bb0 sp=0xc000093b78 pc=0x561ad981cecb
Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/llama._Cfunc_llama_decode(0x7f3e64f8acc0, {0x1c, 0x7f3e647043e0, 0x0, 0x7f3e64f89940, 0x7f3e6454d2c0, 0x7f3e64faa310, 0x7f3e6498c600})
Jun 18 22:16:41 ollama[116408]: 	_cgo_gotypes.go:605 +0x4a fp=0xc000093bd8 sp=0xc000093bb0 pc=0x561ad9bca9aa
Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/llama.(*Context).Decode.func1(...)
Jun 18 22:16:41 ollama[116408]: 	github.com/ollama/ollama/llama/llama.go:133
Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/llama.(*Context).Decode(0xc000508588?, 0x1?)
Jun 18 22:16:41 ollama[116408]: 	github.com/ollama/ollama/llama/llama.go:133 +0xed fp=0xc000093cc0 sp=0xc000093bd8 pc=0x561ad9bccbed
Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/runner/llamarunner.(*Server).processBatch(0xc0004c4360, 0xc00052a050, 0xc000508728)
Jun 18 22:16:41 ollama[116408]: 	github.com/ollama/ollama/runner/llamarunner/runner.go:436 +0x209 fp=0xc000093ee8 sp=0xc000093cc0 pc=0x561ad9c84da9
Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/runner/llamarunner.(*Server).run(0xc0004c4360, {0x561adab5fd90, 0xc0000fda40})
Jun 18 22:16:41 ollama[116408]: 	github.com/ollama/ollama/runner/llamarunner/runner.go:341 +0x1bb fp=0xc000093fb8 sp=0xc000093ee8 pc=0x561ad9c84a1b
Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/runner/llamarunner.Execute.gowrap2()
Jun 18 22:16:41 ollama[116408]: 	github.com/ollama/ollama/runner/llamarunner/runner.go:855 +0x28 fp=0xc000093fe0 sp=0xc000093fb8 pc=0x561ad9c891e8
Jun 18 22:16:41 ollama[116408]: runtime.goexit({})
Jun 18 22:16:41 ollama[116408]: 	runtime/asm_amd64.s:1700 +0x1 fp=0xc000093fe8 sp=0xc000093fe0 pc=0x561ad9827901
Jun 18 22:16:41 ollama[116408]: created by github.com/ollama/ollama/runner/llamarunner.Execute in goroutine 1
Jun 18 22:16:41 ollama[116408]: 	github.com/ollama/ollama/runner/llamarunner/runner.go:855 +0xc37
Jun 18 22:16:41 ollama[116408]: goroutine 1 gp=0xc000002380 m=nil [IO wait]:
Jun 18 22:16:41 ollama[116408]: runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
Jun 18 22:16:41 ollama[116408]: 	runtime/proc.go:435 +0xce fp=0xc000127608 sp=0xc0001275e8 pc=0x561ad98201ce
Jun 18 22:16:41 ollama[116408]: runtime.netpollblock(0xc000127658?, 0xd97b9ae6?, 0x1a?)
Jun 18 22:16:41 ollama[116408]: 	runtime/netpoll.go:575 +0xf7 fp=0xc000127640 sp=0xc000127608 pc=0x561ad97e4fb7
Jun 18 22:16:41 ollama[116408]: internal/poll.runtime_pollWait(0x7f407904beb0, 0x72)
Jun 18 22:16:41 ollama[116408]: 	runtime/netpoll.go:351 +0x85 fp=0xc000127660 sp=0xc000127640 pc=0x561ad981f3e5
Jun 18 22:16:41 ollama[116408]: internal/poll.(*pollDesc).wait(0xc0004ab480?, 0x900000036?, 0x0)
Jun 18 22:16:41 ollama[116408]: 	internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc000127688 sp=0xc000127660 pc=0x561ad98a6827
Jun 18 22:16:41 ollama[116408]: internal/poll.(*pollDesc).waitRead(...)
Jun 18 22:16:41 ollama[116408]: 	internal/poll/fd_poll_runtime.go:89
Jun 18 22:16:41 ollama[116408]: internal/poll.(*FD).Accept(0xc0004ab480)
Jun 18 22:16:41 ollama[116408]: 	internal/poll/fd_unix.go:620 +0x295 fp=0xc000127730 sp=0xc000127688 pc=0x561ad98abbf5
Jun 18 22:16:41 ollama[116408]: net.(*netFD).accept(0xc0004ab480)
Jun 18 22:16:41 ollama[116408]: 	net/fd_unix.go:172 +0x29 fp=0xc0001277e8 sp=0xc000127730 pc=0x561ad991e109
Jun 18 22:16:41 ollama[116408]: net.(*TCPListener).accept(0xc000140fc0)
Jun 18 22:16:41 ollama[116408]: 	net/tcpsock_posix.go:159 +0x1b fp=0xc000127838 sp=0xc0001277e8 pc=0x561ad9933abb
Jun 18 22:16:41 ollama[116408]: net.(*TCPListener).Accept(0xc000140fc0)
Jun 18 22:16:41 ollama[116408]: 	net/tcpsock.go:380 +0x30 fp=0xc000127868 sp=0xc000127838 pc=0x561ad9932970
Jun 18 22:16:41 ollama[116408]: net/http.(*onceCloseListener).Accept(0xc0004c4480?)
Jun 18 22:16:41 ollama[116408]: 	<autogenerated>:1 +0x24 fp=0xc000127880 sp=0xc000127868 pc=0x561ad9b4a0c4
Jun 18 22:16:41 ollama[116408]: net/http.(*Server).Serve(0xc000035900, {0x561adab5d938, 0xc000140fc0})
Jun 18 22:16:41 ollama[116408]: 	net/http/server.go:3424 +0x30c fp=0xc0001279b0 sp=0xc000127880 pc=0x561ad9b2198c
Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/runner/llamarunner.Execute({0xc000034120, 0xe, 0xe})
Jun 18 22:16:41 ollama[116408]: 	github.com/ollama/ollama/runner/llamarunner/runner.go:875 +0x100a fp=0xc000127d08 sp=0xc0001279b0 pc=0x561ad9c88f2a
Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/runner.Execute({0xc000034110?, 0x0?, 0x0?})
Jun 18 22:16:41 ollama[116408]: 	github.com/ollama/ollama/runner/runner.go:22 +0xd4 fp=0xc000127d30 sp=0xc000127d08 pc=0x561ad9d07b34
Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/cmd.NewCLI.func2(0xc000035600?, {0x561ada6b106e?, 0x4?, 0x561ada6b1072?})
Jun 18 22:16:41 ollama[116408]: 	github.com/ollama/ollama/cmd/cmd.go:1529 +0x45 fp=0xc000127d58 sp=0xc000127d30 pc=0x561ada456f65
Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra.(*Command).execute(0xc0004c8f08, {0xc000000700, 0xe, 0xe})
Jun 18 22:16:41 ollama[116408]: 	github.com/spf13/cobra@v1.7.0/command.go:940 +0x85c fp=0xc000127e78 sp=0xc000127d58 pc=0x561ad999775c
Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra.(*Command).ExecuteC(0xc000149508)
Jun 18 22:16:41 ollama[116408]: 	github.com/spf13/cobra@v1.7.0/command.go:1068 +0x3a5 fp=0xc000127f30 sp=0xc000127e78 pc=0x561ad9997fa5
Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra.(*Command).Execute(...)
Jun 18 22:16:41 ollama[116408]: 	github.com/spf13/cobra@v1.7.0/command.go:992
Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra.(*Command).ExecuteContext(...)
Jun 18 22:16:41 ollama[116408]: 	github.com/spf13/cobra@v1.7.0/command.go:985
Jun 18 22:16:41 ollama[116408]: main.main()
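
An aside on the override visible at the top of the log (HSA_OVERRIDE_GFX_VERSION=11.0.3): hipLaunchKernel returning hipErrorInvalidDeviceFunction typically means the bundled ROCm kernels were not compiled for the gfx target selected by the override. A workaround sometimes suggested for gfx1103 iGPUs like the 780M (untested here; the file path and value are assumptions) is a systemd drop-in:

```ini
# Hypothetical drop-in, e.g. /etc/systemd/system/ollama.service.d/rocm.conf.
# 11.0.0 is a commonly suggested override for RDNA3 iGPUs (gfx1103);
# whether it avoids this particular crash is not verified here.
[Service]
Environment="HSA_OVERRIDE_GFX_VERSION=11.0.0"
```

followed by `sudo systemctl daemon-reload && sudo systemctl restart ollama`.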

OS

Linux

GPU

AMD

CPU

AMD

Ollama version

0.9.2

Jun 18 22:16:41 ollama[116408]: runtime/netpoll.go:575 +0xf7 fp=0xc000127640 sp=0xc000127608 pc=0x561ad97e4fb7 Jun 18 22:16:41 ollama[116408]: internal/poll.runtime_pollWait(0x7f407904beb0, 0x72) Jun 18 22:16:41 ollama[116408]: runtime/netpoll.go:351 +0x85 fp=0xc000127660 sp=0xc000127640 pc=0x561ad981f3e5 Jun 18 22:16:41 ollama[116408]: internal/poll.(*pollDesc).wait(0xc0004ab480?, 0x900000036?, 0x0) Jun 18 22:16:41 ollama[116408]: internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc000127688 sp=0xc000127660 pc=0x561ad98a6827 Jun 18 22:16:41 ollama[116408]: internal/poll.(*pollDesc).waitRead(...) Jun 18 22:16:41 ollama[116408]: internal/poll/fd_poll_runtime.go:89 Jun 18 22:16:41 ollama[116408]: internal/poll.(*FD).Accept(0xc0004ab480) Jun 18 22:16:41 ollama[116408]: internal/poll/fd_unix.go:620 +0x295 fp=0xc000127730 sp=0xc000127688 pc=0x561ad98abbf5 Jun 18 22:16:41 ollama[116408]: net.(*netFD).accept(0xc0004ab480) Jun 18 22:16:41 ollama[116408]: net/fd_unix.go:172 +0x29 fp=0xc0001277e8 sp=0xc000127730 pc=0x561ad991e109 Jun 18 22:16:41 ollama[116408]: net.(*TCPListener).accept(0xc000140fc0) Jun 18 22:16:41 ollama[116408]: net/tcpsock_posix.go:159 +0x1b fp=0xc000127838 sp=0xc0001277e8 pc=0x561ad9933abb Jun 18 22:16:41 ollama[116408]: net.(*TCPListener).Accept(0xc000140fc0) Jun 18 22:16:41 ollama[116408]: net/tcpsock.go:380 +0x30 fp=0xc000127868 sp=0xc000127838 pc=0x561ad9932970 Jun 18 22:16:41 ollama[116408]: net/http.(*onceCloseListener).Accept(0xc0004c4480?) 
Jun 18 22:16:41 ollama[116408]: <autogenerated>:1 +0x24 fp=0xc000127880 sp=0xc000127868 pc=0x561ad9b4a0c4 Jun 18 22:16:41 ollama[116408]: net/http.(*Server).Serve(0xc000035900, {0x561adab5d938, 0xc000140fc0}) Jun 18 22:16:41 ollama[116408]: net/http/server.go:3424 +0x30c fp=0xc0001279b0 sp=0xc000127880 pc=0x561ad9b2198c Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/runner/llamarunner.Execute({0xc000034120, 0xe, 0xe}) Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/runner/llamarunner/runner.go:875 +0x100a fp=0xc000127d08 sp=0xc0001279b0 pc=0x561ad9c88f2a Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/runner.Execute({0xc000034110?, 0x0?, 0x0?}) Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/runner/runner.go:22 +0xd4 fp=0xc000127d30 sp=0xc000127d08 pc=0x561ad9d07b34 Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/cmd.NewCLI.func2(0xc000035600?, {0x561ada6b106e?, 0x4?, 0x561ada6b1072?}) Jun 18 22:16:41 ollama[116408]: github.com/ollama/ollama/cmd/cmd.go:1529 +0x45 fp=0xc000127d58 sp=0xc000127d30 pc=0x561ada456f65 Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra.(*Command).execute(0xc0004c8f08, {0xc000000700, 0xe, 0xe}) Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra@v1.7.0/command.go:940 +0x85c fp=0xc000127e78 sp=0xc000127d58 pc=0x561ad999775c Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra.(*Command).ExecuteC(0xc000149508) Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra@v1.7.0/command.go:1068 +0x3a5 fp=0xc000127f30 sp=0xc000127e78 pc=0x561ad9997fa5 Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra.(*Command).Execute(...) Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra@v1.7.0/command.go:992 Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra.(*Command).ExecuteContext(...) Jun 18 22:16:41 ollama[116408]: github.com/spf13/cobra@v1.7.0/command.go:985 Jun 18 22:16:41 ollama[116408]: main.main() ``` ### OS Linux ### GPU AMD ### CPU AMD ### Ollama version 0.9.2
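On the "where does this 3.7 GiB come from" question: `runner.size="3.7 GiB"` in the `sched.go` log is the scheduler's *estimate* of what the runner will need in VRAM (weights plus KV cache plus scratch buffers), not a measured allocation. A back-of-envelope sketch of such an estimate — every number below is an assumption for illustration (llama3.2 3B at Q4, f16 KV cache), not ollama's actual formula — shows how the figure can land well above the ~2 GiB model blob:

```python
# Rough, hypothetical VRAM estimate for llama3.2 (3B, Q4) at num_ctx=8192.
# All constants are assumptions for illustration, NOT ollama's real accounting.

GiB = 1024 ** 3

weights_gib = 2.0        # assumed: ~2 GiB quantized weight blob on disk
n_layers    = 28         # assumed llama3.2 3B architecture
n_kv_heads  = 8
head_dim    = 128
f16_bytes   = 2
num_ctx     = 8192       # from the log: runner.num_ctx=8192

# K and V caches: one f16 entry per layer, per KV head, per head dim, per token
kv_bytes_per_token = 2 * n_layers * n_kv_heads * head_dim * f16_bytes
kv_gib = kv_bytes_per_token * num_ctx / GiB      # 0.875 GiB at 8192 tokens

graph_gib = 0.8          # assumed: compute-graph / scratch buffer overhead

total_gib = weights_gib + kv_gib + graph_gib
print(f"estimated footprint: ~{total_gib:.1f} GiB")
```

With these assumed inputs the sum comes out near the 3.7 GiB in the log, which is why a model whose blob is ~2 GiB can still be scheduled as a ~3.7 GiB runner.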
GiteaMirror added the bug label 2026-04-12 19:23:05 -05:00

@fluchtkapsel commented on GitHub (Dec 1, 2025):

I, too, get this error

```
Memory critical error by agent node-0 (Agent handle: 0x5613fe057580) on address 0x7f9034f7c000. Reason: Memory in use.
```

but not with `ollama` — with `clinfo` and with the PyTorch MNIST examples, and only when running in a podman container derived from https://github.com/Toxantron/iGPU-Docker/tree/main/ROCm

The guide assumes ROCm 6.4.1, and I adapted it to 7.1.1. In parallel I followed the same steps in a Distrobox container on Bazzite; there I do not get this error with clinfo or the MNIST examples. @brpaz didn't mention how they set up everything, but in my case the error seems to stem from differences in the containerization details. Maybe they used Docker or something similar, too?
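For readers hitting the same `hipErrorInvalidDeviceFunction` on RDNA3 iGPUs such as the 780M (gfx1103): "invalid device function" typically means the ROCm libraries in use were not built with kernels for that gfx target. A widely used community workaround (not confirmed by the maintainers in this thread) is to point the HSA runtime at the nearest officially built target via `HSA_OVERRIDE_GFX_VERSION`. A minimal helper sketching that mapping — the table below is an assumption compiled from community reports, so verify it against your own ROCm build:

```shell
# Hypothetical helper: map a reported gfx target to the HSA_OVERRIDE_GFX_VERSION
# value community reports suggest for the nearest officially built target.
# This mapping is an assumption for illustration, not official ROCm guidance.
gfx_override() {
  case "$1" in
    gfx1103|gfx1102|gfx1101) echo "11.0.0" ;;  # RDNA3 parts -> pretend gfx1100
    gfx1036|gfx1035|gfx1031) echo "10.3.0" ;;  # RDNA2 parts -> pretend gfx1030
    *)                       echo ""       ;;  # no known override
  esac
}

gfx_override gfx1103   # prints 11.0.0
```

You can check which target the driver actually reports with `rocminfo | grep gfx` before exporting the override (e.g. `export HSA_OVERRIDE_GFX_VERSION=11.0.0` in the environment of the ollama service or container).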

Reference: github-starred/ollama#7335