[GH-ISSUE #13819] Really bad performances on Raspberry Pi 5 for 0.14.3-RC3 #34811

Closed
opened 2026-04-22 18:42:21 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @Oneil974 on GitHub (Jan 21, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/13819

What is the issue?

Hello,

I tested simple models on a Raspberry Pi 5 with 0.14.3-RC3 and I'm getting around 2.6 tokens/s instead of 26 tokens/s.

Models tested:
lfm2.5-thinking:1.2b
gemma3:270M
qwen3:1.7b
qwen3:0.6b

Everything is fine after rolling back to 0.14.2.

Best regards
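One way to reproduce the measurement (a sketch; assumes the `ollama` CLI is installed and the model has been pulled) is to run a prompt with `--verbose`, which prints timing statistics including the eval rate in tokens/s:

```shell
# Pull one of the affected models (assumes network access).
ollama pull qwen3:0.6b

# --verbose prints timing stats after the response,
# including "eval rate" in tokens/s.
ollama run qwen3:0.6b --verbose "Why is the sky blue?"
```

Comparing the reported eval rate for the same prompt on 0.14.2 and 0.14.3-RC3 would confirm the regression.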

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-22 18:42:21 -05:00
Author
Owner

@rick-github commented on GitHub (Jan 21, 2026):

[Server log](https://docs.ollama.com/troubleshooting) may aid in debugging.
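On Raspberry Pi OS and other systemd-based Linux distributions, the server log can typically be collected as follows (a sketch based on the Ollama troubleshooting docs; assumes Ollama runs as the `ollama` systemd service):

```shell
# View the last 200 lines of the Ollama server log (systemd-based Linux).
journalctl -u ollama --no-pager -n 200

# Optionally enable debug logging before reproducing the slowdown:
sudo systemctl edit ollama      # add: Environment="OLLAMA_DEBUG=1"
sudo systemctl restart ollama
```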
Reference: github-starred/ollama#34811