[GH-ISSUE #9684] ollama 0.6.0 can't run gemma3 on AMD Ryzen 5 3400G CPU #32080

Closed
opened 2026-04-22 12:59:34 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @lesshaste on GitHub (Mar 12, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9684

What is the issue?

This is the full set of steps I have taken and the problem:

```
curl -fsSL https://ollama.com/install.sh | sh
>>> Installing ollama to /usr/local
[sudo] password for user: 
>>> Downloading Linux amd64 bundle
######################################################################## 100.0%
>>> Creating ollama user...
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> Downloading Linux ROCm amd64 bundle
######################################################################## 100.0%
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
>>> AMD GPU ready.
(base) user@user-home:~/Downloads$ ollama run gemma3:12b
Error: llama runner process has terminated: this model is not supported by your version of Ollama. You may need to upgrade
```

I also see this:

```
ollama -v
ollama version is 0.0.0
Warning: client version is 0.6.0
```

I can run other models.

Relevant log output

See above

OS

Distributor ID: Ubuntu
Description: Ubuntu 24.04.2 LTS
Release: 24.04

GPU

No GPU

CPU

AMD Ryzen 5 3400G CPU

Ollama version

0.6.0

GiteaMirror added the bug label 2026-04-22 12:59:34 -05:00

@rick-github commented on GitHub (Mar 12, 2025):

You are running a locally built version of Ollama; that's why the server version is "0.0.0". Try this: `sudo kill $(pidof ollama)`, then run `ollama -v` again.
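The suggested fix can be sketched as the following session. This is a hypothetical illustration, not output from the reporter's machine: the `which` path and the restart step are assumptions (on a standard install.sh setup, systemd manages the service, so restarting it brings up the newly installed `/usr/local/bin/ollama` binary).

```
# Check which binary is on PATH (assumed standard install location).
which ollama

# Stop the stale, locally built server that is answering on 127.0.0.1:11434.
sudo kill $(pidof ollama)

# Let systemd start the freshly installed binary (assumption: the
# install.sh-created "ollama" service unit is enabled, as in the log above).
sudo systemctl restart ollama

# Client and server versions should now match, with no version warning.
ollama -v
```

The key point is that `ollama -v` reports both the client binary's version and the running server's version; a "0.0.0" server with a 0.6.0 client warning means an old, separately built server process is still holding the port.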


@lesshaste commented on GitHub (Mar 12, 2025):

Thank you. That solves the problem! Please close the issue.


Reference: github-starred/ollama#32080