[GH-ISSUE #13100] Allow usage of the already-installed llama.cpp when building #34431

Open
opened 2026-04-22 17:59:52 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @UnitedMarsupials on GitHub (Nov 15, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13100

At the moment it is not possible to build ollama against an already-installed llama.cpp -- the build process insists on using the bundled sources instead.

Whether or not such bundling is a good idea at all, it should be possible to use the already-installed llama.cpp.
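For illustration, one way such an opt-in could look in the build system (a hypothetical sketch -- the `OLLAMA_USE_SYSTEM_LLAMA` option name and the `ollama_backend` target are assumptions, not existing parts of the ollama build; `find_package(Llama)` relies on the CMake package config that llama.cpp installs via `cmake --install`):

```cmake
# Hypothetical sketch: an opt-in flag to link against a system-installed
# llama.cpp instead of the bundled sources. OLLAMA_USE_SYSTEM_LLAMA is an
# assumed option name; it does not exist in the current build.
option(OLLAMA_USE_SYSTEM_LLAMA "Link against an installed llama.cpp" OFF)

if(OLLAMA_USE_SYSTEM_LLAMA)
  # llama.cpp installs a CMake package config file when installed with
  # `cmake --install`, so find_package can locate the system copy.
  find_package(Llama REQUIRED)
else()
  # Default behaviour today: build the vendored copy in the source tree.
  add_subdirectory(llama.cpp)
endif()

# ollama_backend is a placeholder target name for illustration.
target_link_libraries(ollama_backend PRIVATE llama)
```

With a flag like this, packagers could configure with `cmake -DOLLAMA_USE_SYSTEM_LLAMA=ON` and let the distribution's llama.cpp package satisfy the dependency, while the default build keeps its current behaviour.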

GiteaMirror added the feature request label 2026-04-22 17:59:52 -05:00

Reference: github-starred/ollama#34431