[GH-ISSUE #14757] Ollama is crashing on Ubuntu 25.10 for Claude Code tasks #35299

Closed
opened 2026-04-22 19:42:28 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @alex-ramanau on GitHub (Mar 10, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14757

What is the issue?

When I run Claude Code using `qwen3-coder-next` on Ollama, I'm getting periodic crashes of the Ollama service (with core dumps).
It happens after some time under GPU load, anywhere from 10 minutes to an hour of heavy use. I'm mostly running coding tasks, e.g. adding Prometheus metrics to a small Go web service.

[ollama.log](https://github.com/user-attachments/files/25861752/ollama.log)

Relevant log output

OS details:

$ lsb_release -a                                 
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 25.10
Release:        25.10
Codename:       questing



CPU

╰─$ cat /proc/cpuinfo|grep AMD |grep model|head -n1
model name      : AMD RYZEN AI MAX+ 395 w/ Radeon 8060S


Claude Code config:

export ANTHROPIC_AUTH_TOKEN=ollama
export ANTHROPIC_API_KEY=""
export ANTHROPIC_BASE_URL=http://10.0.2.2:11434
export OLLAMA_MODEL=qwen3-coder-next
claude  --model qwen3-coder-next

OS

Linux

GPU

AMD

CPU

AMD

Ollama version

0.17.0

GiteaMirror added the bug label 2026-04-22 19:42:28 -05:00
Author
Owner

@rick-github commented on GitHub (Mar 10, 2026):

сак 10 06:58:50 eniac-X2 ollama[75163]: HW Exception by GPU node-1 (Agent handle: 0x7dfe546f0130) reason :GPU Hang

This is a problem with the amdgpu kernel driver. It's been reported a few times; choosing a different kernel seems to help a bit, but it's not clear which kernel is best. Try switching to [Vulkan](https://docs.ollama.com/gpu#vulkan-gpu-support) and see if the problem persists.

Author
Owner

@chejh-amd commented on GitHub (Mar 18, 2026):

A few additional things that might help narrow this down:

1. Kernel version differences matter a lot on Ryzen AI / APU hardware. Several users have reported that 6.8 / 6.9 / 6.11 behave differently with the 8060S iGPU. If you're on Ubuntu 25.10, testing an OEM kernel or LTS-derived kernel can improve stability.
2. The Vulkan backend is often more stable than ROCm on Ryzen AI devices. Many APU systems avoid GPU hangs entirely when running: `OLLAMA_VULKAN=1 ollama run`
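For an Ollama instance managed by systemd (the default on Linux installs), the same toggle can be set persistently via a drop-in override instead of the command line. A minimal sketch, assuming the default unit name `ollama.service`:

```shell
# Sketch: persist OLLAMA_VULKAN=1 for the systemd-managed Ollama service.
# Assumes the default ollama.service unit from the Linux installer.
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/vulkan.conf >/dev/null <<'EOF'
[Service]
Environment="OLLAMA_VULKAN=1"
EOF
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

After restarting, the Ollama log should show the Vulkan backend being selected instead of ROCm.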
Author
Owner

@rick-github commented on GitHub (Mar 20, 2026):

AMD [recommends](https://rocm.docs.amd.com/en/latest/how-to/system-optimization/strixhalo.html#required-kernel-version) Linux kernel 6.18.4 or newer for 8060S support.
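A quick way to check whether the running kernel meets that minimum, using `sort -V` for version-aware comparison (the 6.18.4 threshold is the one from the linked AMD page):

```shell
# Compare the running kernel against AMD's recommended minimum for the
# 8060S (6.18.4). sort -V orders version strings numerically, so if the
# required version sorts first, the current kernel is new enough.
required="6.18.4"
current="$(uname -r | cut -d- -f1)"
if [ "$(printf '%s\n' "$required" "$current" | sort -V | head -n1)" = "$required" ]; then
    echo "kernel $current meets the $required minimum"
else
    echo "kernel $current is older than $required; consider upgrading"
fi
```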

Reference: github-starred/ollama#35299