[GH-ISSUE #4069] Support building llama.cpp with Intel oneMKL #64563

Closed
opened 2026-05-03 18:08:21 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @MarkWard0110 on GitHub (May 1, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4069

What would be required to build with [Intel oneMKL](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#intel-onemkl)?

It seems this is how the `avx_vnni` instruction set is made available on Intel processors that do not support AVX512; the Intel Core i9 14900K does not support AVX512.

It also references Intel's publication [Optimizing and Running LLaMA2 on Intel CPU](https://www.intel.com/content/www/us/en/content-details/791610/optimizing-and-running-llama2-on-intel-cpu.html).

There is also this report: [AVX2 is dimwitted compared to AVX512](https://github.com/google/gemma.cpp/issues/23). How do different builds affect how models perform on Intel CPUs?

What if the `llama.cpp` build configuration plus quantized models amounts to doubling down on poorly performing models?
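For reference, the llama.cpp README's Intel oneMKL section (linked above) describes a build along these lines. This is a sketch, not a verified Ollama build recipe: it assumes the Intel oneAPI toolkit is installed under the default `/opt/intel/oneapi` path, and the exact CMake flag names vary between llama.cpp versions.

```shell
# Load the oneAPI environment (assumes the default oneAPI install location)
source /opt/intel/oneapi/setvars.sh

# Configure llama.cpp against oneMKL's BLAS using Intel's icx/icpx compilers.
# -DLLAMA_NATIVE=ON lets the compiler target the host CPU's instruction sets
# (e.g. avx_vnni on a Core i9 14900K, which lacks AVX512).
mkdir -p build && cd build
cmake .. -DLLAMA_BLAS=ON \
         -DLLAMA_BLAS_VENDOR=Intel10_64lp \
         -DCMAKE_C_COMPILER=icx \
         -DCMAKE_CXX_COMPILER=icpx \
         -DLLAMA_NATIVE=ON
cmake --build . --config Release
```

Whether Ollama's vendored llama.cpp build could simply adopt these flags is exactly the open question of this issue.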

GiteaMirror added the intel and feature request labels 2026-05-03 18:08:21 -05:00
Author
Owner

@dhiltgen commented on GitHub (Jul 25, 2024):

Intel GPU support is being tracked via #1590

AVX512 (and friends) is tracked via #2205


Reference: github-starred/ollama#64563