[GH-ISSUE #1503] Invalid Opcode Error in Ubuntu Server #62851

Closed
opened 2026-05-03 10:30:23 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @Gyarados on GitHub (Dec 13, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1503

When trying to run any model in Ubuntu Server, locally and in a container, I get the following messages in the Ollama logs:

```
$ journalctl -u ollama -f
Dec 13 15:28:54 desimachine ollama[1471335]: 2023/12/13 15:28:54 download.go:123: downloading 58e1b82a691f in 1 18 B part(s)
Dec 13 15:28:58 desimachine ollama[1471335]: 2023/12/13 15:28:58 download.go:123: downloading 658e00cf526b in 1 529 B part(s)
Dec 13 15:29:09 desimachine ollama[1471335]: [GIN] 2023/12/13 - 15:29:09 | 200 |         2m53s |       127.0.0.1 | POST     "/api/pull"
Dec 13 15:29:10 desimachine ollama[1471335]: 2023/12/13 15:29:10 llama.go:397: skipping accelerated runner because num_gpu=0
Dec 13 15:29:10 desimachine ollama[1471335]: 2023/12/13 15:29:10 llama.go:434: starting llama runner
Dec 13 15:29:10 desimachine ollama[1471335]: 2023/12/13 15:29:10 llama.go:492: waiting for llama runner to start responding
Dec 13 15:29:10 desimachine ollama[1471335]: 2023/12/13 15:29:10 llama.go:449: signal: illegal instruction (core dumped)
Dec 13 15:29:10 desimachine ollama[1471335]: 2023/12/13 15:29:10 llama.go:457: error starting llama runner: llama runner process has terminated
Dec 13 15:29:10 desimachine ollama[1471335]: 2023/12/13 15:29:10 llama.go:523: llama runner stopped successfully
```

And this is the log from the kernel:

```
$ sudo dmesg
...
[67864.232068] traps: ollama-runner[1485327] trap invalid opcode ip:5080dc sp:7ffd98094950 error:0 in ollama-runner[408000+16d000]
...
```
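(For anyone decoding that trap line: `ip` is the faulting instruction pointer, and the bracketed `408000+16d000` is the base address and size of the `ollama-runner` mapping, so subtracting the base gives the offset of the bad instruction inside the binary. A minimal sketch, using the addresses from the dmesg line above; the `objdump` suggestion in the comment is only one way to inspect it:)

```shell
# Offset of the faulting instruction inside ollama-runner, taken from:
#   ip:5080dc ... in ollama-runner[408000+16d000]
ip=0x5080dc      # faulting instruction pointer
base=0x408000    # start address of the ollama-runner mapping
printf 'offset: 0x%x\n' $((ip - base))   # → offset: 0x1000dc
# With the binary at hand, the trapping instruction could then be
# inspected with e.g. objdump -d ollama-runner, looking near that offset.
```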

This is my OS version:

```
$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 22.04.3 LTS
Release:        22.04
Codename:       jammy
```

My CPU is an Intel Celeron N4020.

I couldn't find much information about this online, other than that the kernel's "invalid opcode" trap means the process tried to execute an instruction that this CPU does not implement.

I also have a Windows laptop with an i7 where ollama worked perfectly using Docker.

Any tips?


@easp commented on GitHub (Dec 13, 2023):

That CPU doesn't support AVX instructions, which are currently required by Ollama. https://www.intel.com/content/www/us/en/products/sku/197310/intel-celeron-processor-n4020-4m-cache-up-to-2-80-ghz/specifications.html

If you search, I think someone with the same problem posted how they built Ollama without AVX instructions.
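(For completeness: whether a CPU supports AVX can be checked directly from the feature flags the kernel exposes; on the N4020 the `avx` flag should be absent. A minimal check, not from the original thread:)

```shell
# Check whether this CPU advertises AVX; Ollama's prebuilt runner
# required it at the time, so its absence explains the SIGILL /
# invalid opcode trap. Output depends on the machine it runs on.
if grep -q '\bavx\b' /proc/cpuinfo; then
    echo "AVX supported"
else
    echo "AVX NOT supported"
fi
```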

<!-- gh-comment-id:1854384533 -->

@Gyarados commented on GitHub (Dec 13, 2023):

I should have searched better before opening this issue! Thanks a lot!

<!-- gh-comment-id:1854412870 -->

@Gyarados commented on GitHub (Dec 13, 2023):

Closing for https://github.com/jmorganca/ollama/issues/1279

<!-- gh-comment-id:1854415145 -->
Reference: github-starred/ollama#62851