[GH-ISSUE #2000] Issue with Ollama on Ubuntu 22.04 under VirtualBox 7 Windows 11 #1153

Closed
opened 2026-04-12 10:55:00 -05:00 by GiteaMirror · 6 comments

Originally created by @dekogroup on GitHub (Jan 15, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2000

Originally assigned to: @dhiltgen on GitHub.

On this platform, Ollama installed successfully, but I got the following error when running:

ollama run codellama:7b-instruct

Illegal instruction (core dumped)

@Rushmore75 commented on GitHub (Jan 19, 2024):

Getting the same result on any command; I'm using Proxmox 8.1.3 though.

@Rushmore75 commented on GitHub (Jan 19, 2024):

@dekogroup try building from source and bumping up the version of the mimetype dependency

@dekogroup commented on GitHub (Jan 19, 2024):

Thank you @Rushmore75. Will try.

@dhiltgen commented on GitHub (Jan 26, 2024):

@dekogroup my suspicion is that you have VirtualBox configured to mask CPU features like AVX and AVX2. Older builds would crash with an illegal instruction. We have recently added support for running on CPUs without these vector math extensions, but if you can update your VirtualBox configuration, I'd recommend that, as it will give better performance whenever we have to do any inference on the CPU.

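A quick way to verify the masking dhiltgen describes is to check which SIMD flags the guest CPU actually exposes. The sketch below is illustrative and self-contained: it greps a hypothetical sample `flags` line rather than the live `/proc/cpuinfo` (swap in the real file inside the VM).

```shell
#!/bin/sh
# Check whether the guest CPU advertises AVX / AVX2.
# "sample" is a hypothetical flags line; inside the VM, replace it with:
#   sample=$(grep -m1 '^flags' /proc/cpuinfo)
sample="flags : fpu vme de pse sse sse2 ssse3 sse4_1 sse4_2 avx avx2"

has_flag() {
    # -w matches the flag as a whole word, so "avx" does not match inside "avx2"
    printf '%s\n' "$sample" | grep -qw "$1"
}

for f in avx avx2; do
    if has_flag "$f"; then
        echo "$f: present"
    else
        echo "$f: missing"
    fi
done
```

If the flags are missing in the guest but present on the Windows host, the VirtualBox CPU profile is likely masking them. On VirtualBox 6.0 and later, running something like `VBoxManage modifyvm "<vm-name>" --cpu-profile host` on the host while the VM is powered off is the usual knob, though the exact option may vary by VirtualBox version.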
@orlyandico commented on GitHub (Feb 3, 2024):

Not an exact answer, but Ollama works great on WSL2 with Ubuntu 22.04 on Windows 11. That's Hyper-V underneath. The GPU works fine, and there's no performance hit that I can measure.

@dhiltgen commented on GitHub (Feb 19, 2024):

Recent builds will no longer crash, but they will not execute on the GPU due to the lack of AVX support. Potentially adding non-AVX support to the GPU builds is tracked in issue #2187.

Reference: github-starred/ollama#1153