[GH-ISSUE #7011] ollama run llama3.2 --- Error: exception done_getting_tensors: wrong number of tensors; expected 255, got 254 #4442

Closed
opened 2026-04-12 15:22:25 -05:00 by GiteaMirror · 6 comments

Originally created by @andytriboletti on GitHub (Sep 27, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7011

### What is the issue?

I ran `ollama run llama3.2` and got this error:

```
Error: exception done_getting_tensors: wrong number of tensors; expected 255, got 254
```

`ollama -v` reports:

```
ollama version is 0.1.30
Warning: client version is 0.2.8
```

### OS

Linux

### GPU

Nvidia

### CPU

Intel

### Ollama version

0.1.30

GiteaMirror added the bug label 2026-04-12 15:22:25 -05:00

@0x00cl commented on GitHub (Sep 27, 2024):

Try updating your ollama version and see if that works: the latest is [v0.3.12](https://github.com/ollama/ollama/releases/tag/v0.3.12) and you are running v0.1.30. I'm pretty sure the website used to say this, but as they add models, newer models do require a newer version of ollama to work.

[Updating ollama](https://github.com/ollama/ollama/blob/main/docs/linux.md#updating) is pretty much the same command as installing it.
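
For reference, updating on Linux is the same one-liner as installing:

```shell
# Re-running the install script replaces the binary under /usr/local
# and re-enables/starts the systemd service.
curl -fsSL https://ollama.com/install.sh | sh
```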


@andytriboletti commented on GitHub (Sep 28, 2024):

I ran `curl -fsSL https://ollama.com/install.sh | sh` on my WSL Ubuntu and I still have the old version and a version mismatch.

```shell
(base) andy@andys-pc:~$ curl -fsSL https://ollama.com/install.sh | sh
>>> Installing ollama to /usr/local
[sudo] password for andy:
>>> Downloading Linux amd64 bundle
######################################################################## 100.0%##O#-#
######################################################################## 100.0%
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
>>> Nvidia GPU detected.
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
(base) andy@andys-pc:~$ ollama run llama3.2
Error: exception done_getting_tensors: wrong number of tensors; expected 255, got 254
(base) andy@andys-pc:~$ ollama --version
ollama version is 0.1.30
Warning: client version is 0.3.12
```


@0x00cl commented on GitHub (Sep 28, 2024):

At least the client looks like it's getting picked up correctly: as you can see, it has updated from 0.2.8 to 0.3.12. The problem looks to be the version being served, which is still stuck at 0.1.30. **Maybe you have a different instance of the ollama server running from the one installed as a service.**

If you run `which ollama`, does the path match the one in `/etc/systemd/system/ollama.service` on the line `ExecStart=/usr/local/bin/ollama serve`?

```shell
$ which ollama
/usr/local/bin/ollama
$ cat /etc/systemd/system/ollama.service | grep ExecStart=
ExecStart=/usr/local/bin/ollama serve
```

When you run `ps aux | grep ollama`, do you get the same path as the output of `which ollama`?

```shell
$ ps aux | grep '[o]llama'
ollama      ...      /usr/local/bin/ollama serve
```
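
If in doubt about which version is actually serving, you can also query the running server directly over the HTTP API; assuming the default 127.0.0.1:11434 address from the install output, something like:

```shell
# The server reports its own version, independent of the client binary.
$ curl -s http://127.0.0.1:11434/api/version
{"version":"0.1.30"}
```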

@andytriboletti commented on GitHub (Sep 28, 2024):

Here are the commands:

```shell
(base) andy@andys-pc:~$ which ollama
/usr/local/bin/ollama
(base) andy@andys-pc:~$ cat /etc/systemd/system/ollama.service | grep ExecStart=
ExecStart=/usr/local/bin/ollama serve
(base) andy@andys-pc:~$ ps aux | grep '[o]llama'
root       161  0.6  0.0  302464   11416 ?  Ssl  11:36  0:00 snapfuse /var/lib/snapd/snaps/ollama_15.snap /snap/ollama/15 -o ro,nodev,allow_other,suid
ollama     437 24.4  3.2 59108548 1067088 ?  Ssl  11:36  0:15 /usr/local/bin/ollama serve
```

@0x00cl commented on GitHub (Sep 28, 2024):

You might have installed ollama using `snap`. You can check this by running `snap list ollama`. Since the recommended way to install is the command given in the docs, I'd suggest uninstalling ollama from snap and re-installing using the curl command. It's possible that uninstalling ollama from snap loses the installed models and other configuration, though I haven't tried snap myself.
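
A quick check along these lines (the output below is illustrative, not from this machine; the revision would line up with the `ollama_15.snap` visible in the `ps aux` output above):

```shell
# If ollama came from snap, it shows up in the list:
$ snap list ollama
Name    Version  Rev  Tracking       Publisher  Notes
ollama  0.1.30   15   latest/stable  -          -
# If it did not, snap reports:
# error: no matching snaps installed
```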

If you care about losing the installed models and any configuration, you could instead try just stopping the snap service and restarting the one installed with systemd.

- Only stopping the snap ollama service and restarting the one installed on your machine:

  ```shell
  $ snap stop ollama
  Stopped.
  $ sudo systemctl restart ollama
  $ ollama --version
  ```

  Mind you, the next time you restart your computer, snap ollama will start again, so you'll have to stop it again.

- Removing ollama from snap:

  ```shell
  $ snap remove ollama
  Stopped.
  $ sudo systemctl restart ollama
  $ ollama --version
  ```

@pdevine commented on GitHub (Oct 3, 2024):

I'll go ahead and close this. @andytriboletti, you should be able to stop the server and restart it. Make sure both your client and server are updated to the newest versions.
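
Concretely, once the snap copy is out of the way, a minimal check using the commands already shown in this thread:

```shell
sudo systemctl restart ollama   # restart the service so it runs the updated binary
ollama -v                       # client and server versions should now match, with no warning
```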
