[GH-ISSUE #7278] llama3.2:latest not running and giving Error: llama runner process no longer running: -1 #51135

Closed
opened 2026-04-28 18:31:27 -05:00 by GiteaMirror · 1 comment

Originally created by @ishu121992 on GitHub (Oct 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7278

What is the issue?

I have been using Ollama for a while and have never encountered this error while running any other LLMs (including llama3.1).

Below is a snapshot of the server log with the error:
![image](https://github.com/user-attachments/assets/6b2975aa-7d2f-4e76-ab22-1ca7bf0fa147)

The key issue seems to be a wrong number of tensors. Any help? I have a 3070 Ti GPU with 8 GB of VRAM.

OS

WSL2

GPU

Nvidia

CPU

Intel

Ollama version

0.1.32

GiteaMirror added the bug label 2026-04-28 18:31:27 -05:00

@rick-github commented on GitHub (Oct 21, 2024):

Upgrade ollama, 0.1.32 is too old to run llama3.2.
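A minimal sketch of how you might check and upgrade on Linux/WSL2. The `0.3.12` minimum is an assumption (Llama 3.2 support shipped around the 0.3.x series); the install script URL is Ollama's official one, and `ver_ge` is a hypothetical helper for comparing dotted version strings:

```shell
# ver_ge A B: succeeds when version A >= version B (sort -V compares dotted versions)
ver_ge() { [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]; }

# Extract the installed version, e.g. "ollama version is 0.1.32" -> "0.1.32"
current="$(ollama -v 2>/dev/null | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n1)"

# Assumed minimum version for llama3.2 support
min="0.3.12"

if [ -n "$current" ] && ! ver_ge "$current" "$min"; then
    echo "Ollama $current is too old for llama3.2; upgrading..."
    # Re-running the official install script upgrades an existing install
    curl -fsSL https://ollama.com/install.sh | sh
fi
```

On WSL2 this replaces the Linux binary in place; after upgrading, `ollama -v` should report the new version before you retry `ollama run llama3.2`.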

<!-- gh-comment-id:2425368391 -->

Reference: github-starred/ollama#51135