[GH-ISSUE #2579] Use pkg-config to find llama-cpp libs #48029

Closed
opened 2026-04-28 06:28:40 -05:00 by GiteaMirror · 1 comment

Originally created by @happysalada on GitHub (Feb 18, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2579

Hi, first of all, thank you for ollama, it's amazing!

I'm one of the people trying to maintain this on NixOS. Since 0.1.19, I think, llama-cpp is included directly as a submodule in the repo, and the .so objects are referenced directly by path in the Go build files. Would it be possible to use pkg-config to find those files?
This would make building ollama from source much easier, as llama-cpp could just be specified as a dependency.

I understand that you might have a million more important things to do; I just want to start the discussion.
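For context, the pkg-config approach being requested would rely on llama.cpp installing a `.pc` metadata file that downstream build systems can query. A minimal sketch of what such a file might look like (hypothetical: the file name `llama.pc`, the library name, and the install paths are assumptions for illustration, not something llama.cpp actually ships):

```ini
# llama.pc — hypothetical pkg-config file for an installed llama.cpp
prefix=/usr/local
libdir=${prefix}/lib
includedir=${prefix}/include

Name: llama
Description: llama.cpp inference library
Version: 0.0.0
Libs: -L${libdir} -lllama
Cflags: -I${includedir}
```

With a file like this on the pkg-config search path, a Go build could locate the library through a cgo directive such as `// #cgo pkg-config: llama` instead of hard-coding `.so` paths, which is what makes packaging on distributions like NixOS easier.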

GiteaMirror added the feature request label 2026-04-28 06:28:40 -05:00

@dhiltgen commented on GitHub (Oct 23, 2024):

We've switched to a vendored model for llama.cpp, as we need to apply patches for it to work with Ollama. Using an unmodified upstream version of llama.cpp won't be possible. You should give the new build process a try.

https://github.com/ollama/ollama/blob/main/docs/development.md#transition-to-go-runner


Reference: github-starred/ollama#48029