[GH-ISSUE #6443] Error: llama runner process no longer running: -1 #4052

Closed
opened 2026-04-12 14:56:49 -05:00 by GiteaMirror · 6 comments

Originally created by @ZINE-KHER on GitHub (Aug 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6443

What is the issue?

Hi,

I am getting the error below when trying to run Ollama models (both llama3.1:8b-instruct-q4_1 and llama3.1:8b-instruct-fp16):
`Error: llama runner process no longer running: -1`

After checking the syslog file, I found the following message:
`ollama.listener llama_model_load: error loading model ... wrong number of tensors; expected 292, got 291`

I have `ollama==0.3.1` installed via pip. I also tried installing the latest `ollama-linux-amd64` binary, version 0.3.6 (which is not, to the best of my knowledge, available via pip), but I got the same errors.

These are my specs:
OS: ubuntu 22.04.4 LTS
GPU: Nvidia
CPU: Intel
CUDA: 11.5.119

Do you have any suggestions?

Thank you.

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.3.1, 0.3.6

GiteaMirror added the bug label 2026-04-12 14:56:49 -05:00

@rick-github commented on GitHub (Aug 20, 2024):

The tensor problem was resolved in version 0.3.1, so it's surprising that you are seeing this. If you upgraded to 0.3.6 and are still seeing the problem, then it's possible that your upgrade was not successful. What does `ollama -v` show? If it shows two version numbers, one for the client and the other for the server, then restarting the service might help. If it doesn't, I'd suggest removing and re-installing ollama.
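The check described above can be sketched in shell (a minimal sketch, assuming a standard Linux install where the server runs as the systemd unit `ollama.service`):

```shell
# Compare client and server versions; restart the server if they differ.
# Assumption: a systemd-managed install with a unit named ollama.service.
if command -v ollama >/dev/null 2>&1; then
    # When client and server versions differ, 'ollama -v' prints an extra
    # warning line alongside the server version.
    if ollama -v 2>&1 | grep -qi "warning"; then
        sudo systemctl restart ollama.service
        ollama -v   # should now report a single, matching version
    fi
else
    echo "ollama not found on PATH"
fi
```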


@ZINE-KHER commented on GitHub (Aug 20, 2024):

`ollama --version` shows "version is 0.0.0" when 0.3.1 is installed via pip, and it shows "version is 0.0.0" followed by "Warning: Client version 0.3.6" when 0.3.6 is installed from the binaries. `pip show ollama` shows the right versions.
I tried removing ollama, using `pip uninstall` and `rm` for the binaries, but I got the same issues when re-installing.

@rick-github For your suggestion on restarting the service, should I use `sudo systemctl restart ollama.service`?
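Since both a pip client and a standalone binary are involved here, it can also help to see which `ollama` executable the shell actually resolves (a hedged sketch; the paths in the comments are typical locations, not guarantees):

```shell
# Show every 'ollama' executable on PATH, in resolution order.
# A pip-installed entry point often lands in ~/.local/bin, while the
# official install script puts the binary in /usr/local/bin.
which -a ollama 2>/dev/null || echo "no ollama executable on PATH"
```

If more than one path is listed, the first entry is the one that runs, which can explain a client/server version mismatch.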


@rick-github commented on GitHub (Aug 20, 2024):

That would do it in a normal installation; I don't know about a pip installation. I recommend uninstalling the pip version and running `curl -fsSL https://ollama.com/install.sh | sh`.


@ZINE-KHER commented on GitHub (Aug 21, 2024):

It is now working without issues after manually removing the leftover `/etc/systemd/system/snap-ollama-15.mount` and `/etc/systemd/system/snap.ollama.listener.service` units!


@wangixt commented on GitHub (Aug 22, 2024):

> It is now working without issues after manually removing some /etc/systemd/system/snap-ollama-15.mount and /etc/systemd/system/snap.ollama.listener.service !!

I have encountered the same problem. I would like to know which files were removed, thank you.


@ZINE-KHER commented on GitHub (Aug 22, 2024):

@wangixt Those are ollama services that were, to the best of my knowledge, left over from a previous snap installation.
Just check `/etc/systemd/system/` for those kinds of services.
I did the following to get rid of them:

```shell
sudo systemctl stop snap-ollama-15.mount
sudo systemctl disable snap-ollama-15.mount
sudo rm /etc/systemd/system/snap-ollama-15.mount

sudo systemctl stop snap.ollama.listener.service
sudo systemctl disable snap.ollama.listener.service
sudo rm /etc/systemd/system/snap.ollama.listener.service
```
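A quick way to check for such leftovers before removing anything (a sketch; the grep pattern simply matches "ollama" in unit names, including the snap-era ones above):

```shell
# List ollama-related unit files left under /etc/systemd/system.
ls /etc/systemd/system 2>/dev/null | grep -i ollama \
    || echo "no leftover ollama units in /etc/systemd/system"
# Also list any ollama units systemd currently knows about.
systemctl list-unit-files 2>/dev/null | grep -i ollama \
    || echo "no ollama unit files registered"
```

After removing unit files by hand, `sudo systemctl daemon-reload` tells systemd to forget the deleted files.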
