[GH-ISSUE #1537] Error getting mixtral #838

Closed
opened 2026-04-12 10:30:21 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @JimBabcock59 on GitHub (Dec 15, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1537

Like everyone, we wanted to try mixtral. I downloaded it. Below is my experience:

ollama run mixtral
pulling manifest
pulling 9cd37fe774bf... 100% ▕█████████████████████████████████████████████████████▏ (26/26 GB, 4.7 MB/s)
pulling 79b7eca19f7a... 100% ▕██████████████████████████████████████████████████████████████▏ (43/43 B, 34 B/s)
pulling f427553a0d85... 100% ▕██████████████████████████████████████████████████████████████▏ (63/63 B, 46 B/s)
pulling 302c7acb7ff7... 100% ▕███████████████████████████████████████████████████████████▏ (410/410 B, 195 B/s)
verifying sha256 digest
writing manifest
removing any unused layers
success
⠏ Error: llama runner: failed to load model '/usr/share/ollama/.ollama/models/blobs/sha256:9cd37fe774bf3be341e1ff913a18518cee43d2350ee7107035b5a3a27468c0d4': this model may be incompatible with your version of Ollama. If you previously pulled this model, try updating it by running ollama pull mixtral:latest
jim@Jim:~$ ollama run mixtral
⠸ Error: llama runner: failed to load model '/usr/share/ollama/.ollama/models/blobs/sha256:9cd37fe774bf3be341e1ff913a18518cee43d2350ee7107035b5a3a27468c0d4': this model may be incompatible with your version of Ollama. If you previously pulled this model, try updating it by running ollama pull mixtral:latest
jim@Jim:~$ ollama pull mixtral:latest
pulling manifest
pulling 9cd37fe774bf... 100% ▕███████████████████████████████████████████████████████████▏ (26/26 GB, 284 TB/s)
pulling 79b7eca19f7a... 100% ▕████████████████████████████████████████████████████████████▏ (43/43 B, 1.9 MB/s)
pulling f427553a0d85... 100% ▕████████████████████████████████████████████████████████████▏ (63/63 B, 2.9 MB/s)
pulling 302c7acb7ff7... 100% ▕██████████████████████████████████████████████████████████▏ (410/410 B, 9.9 MB/s)
verifying sha256 digest
writing manifest
removing any unused layers
success
jim@Jim:~$ ollama run mixtral:latest
⠙ Error: llama runner: failed to load model '/usr/share/ollama/.ollama/models/blobs/sha256:9cd37fe774bf3be341e1ff913a18518cee43d2350ee7107035b5a3a27468c0d4': this model may be incompatible with your version of Ollama. If you previously pulled this model, try updating it by running ollama pull mixtral:latest
jim@Jim:~$ ollama pull mixtral:latest
pulling manifest
pulling 9cd37fe774bf... 100% ▕████████████████████████████████████████████████████████████▏ (26/26 GB, 75 TB/s)
pulling 79b7eca19f7a... 100% ▕████████████████████████████████████████████████████████████▏ (43/43 B, 965 kB/s)
pulling f427553a0d85... 100% ▕████████████████████████████████████████████████████████████▏ (63/63 B, 557 kB/s)
pulling 302c7acb7ff7... 100% ▕██████████████████████████████████████████████████████████▏ (410/410 B, 9.4 MB/s)
verifying sha256 digest
writing manifest
removing any unused layers
success
jim@Jim:~$ ollama run mixtral
⠙ Error: llama runner: failed to load model '/usr/share/ollama/.ollama/models/blobs/sha256:9cd37fe774bf3be341e1ff913a18518cee43d2350ee7107035b5a3a27468c0d4': this model may be incompatible with your version of Ollama. If you previously pulled this model, try updating it by running ollama pull mixtral:latest

I have successfully downloaded literally dozens of models through the ollama site and never had an issue. Any suggestions?


@jmorganca commented on GitHub (Dec 15, 2023):

Hello, sorry you hit an error! Which version of Ollama are you running? Mixtral requires [Ollama 0.1.16](https://github.com/jmorganca/ollama/releases/tag/v0.1.16) or later – sorry this isn't more obvious from the error.
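As a hedged sketch of the check being suggested here: you can compare the output of `ollama --version` against the 0.1.16 minimum using `sort -V`. The hard-coded `installed` value below is a sample placeholder — substitute the real output of `ollama --version` on your machine.

```shell
#!/bin/sh
# Sample value; replace with e.g.:
#   installed="$(ollama --version | grep -oE '[0-9]+(\.[0-9]+)+')"
installed="0.1.14"
required="0.1.16"

# sort -V orders version strings numerically; if the required version
# sorts first, the installed version is >= required.
if [ "$(printf '%s\n' "$required" "$installed" | sort -V | head -n1)" = "$required" ]; then
    echo "Ollama $installed is new enough to run Mixtral"
else
    echo "Ollama $installed is older than $required -- update before pulling Mixtral"
fi
```

If the installed version is too old, updating (on Linux, by re-running the official install script) and then `ollama pull mixtral:latest` should resolve the error.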


@igorschlum commented on GitHub (Dec 15, 2023):

@JimBabcock59 it works for me with 0.1.16, and there are already thousands of downloads.
Try updating to version 0.1.16, and if it works, please close the issue :-)


@technovangelist commented on GitHub (Dec 19, 2023):

hi @JimBabcock59, it looks like Jeff's and Igor's comments should solve your issue, so I will go ahead and close it now. If you think there is anything we left out, reopen it and we can address it. Thanks for being part of this great community.


Reference: github-starred/ollama#838