[GH-ISSUE #8696] No more binaries? #5635

Closed
opened 2026-04-12 16:55:14 -05:00 by GiteaMirror · 4 comments

Originally created by @regularRandom on GitHub (Jan 30, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8696

What is the issue?

After the latest merge there is no ollama binary that can be used in a systemd service. Only the following .so files are present:

  • libggml-base.so
  • libggml-cpu-alderlake.so
  • libggml-cpu-haswell.so
  • libggml-cpu-icelake.so
  • libggml-cpu-sandybridge.so
  • libggml-cpu-sapphirerapids.so
  • libggml-cpu-skylakex.so
  • libggml-cuda.so

I tried with the new build sequence (customized a bit):

```
cmake --install-prefix /opt/ollama --preset CUDA -B build
cmake --build --preset CUDA build
```

but only the libraries above are generated in the build directory, and that's all. The previous approach was something like this:

```
make -j -S cuda_avx2
go build .
```

As a result I got an ./ollama binary and a couple of runner libraries, and that worked perfectly as a systemd service. Now it doesn't.
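For context, a minimal systemd unit of the kind the issue describes might look like the following sketch; the install prefix, service user, and restart policy are assumptions, not details from the issue:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
# Assumed prefix matching the cmake --install-prefix above; the binary
# resolves the GGML .so files from ../lib/ollama relative to itself.
ExecStart=/opt/ollama/bin/ollama serve
User=ollama
Restart=always

[Install]
WantedBy=multi-user.target
```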

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.5.7-7

GiteaMirror added the bug label 2026-04-12 16:55:14 -05:00

@jmorganca commented on GitHub (Jan 30, 2025):

Hi @regularRandom, `go build .` is still required as a separate step to build the Go binary. And so running:

```
cmake -B build
cmake --build build
go build .
./ollama serve
```

will create the binary. The binary looks for the .so files under `../lib/ollama` or `build/lib/ollama` (for development) relative to `ollama`.

Sorry about that. Let me know if you're still hitting issues.


@regularRandom commented on GitHub (Jan 30, 2025):

Yes, `go build .` did the trick. This should probably be reflected in the developer guide.

Thank you.


@user-33948 commented on GitHub (Feb 1, 2025):

> Hi @regularRandom, `go build .` is still required as a separate step to build the Go binary. And so running:
>
> ```
> cmake -B build
> cmake --build build
> go build .
> ./ollama serve
> ```
>
> will create the binary. The binary looks for the .so files under `../lib/ollama` or `build/lib/ollama` (for development) relative to `ollama`.
>
> Sorry about that. Let me know if you're still hitting issues.

I had a previous Docker image that relied on Ollama, so I had copied the ollama binary and the models into my Docker image. I went to update the Ollama binary I had (403,976 KB) and similarly noticed there is no longer an ollama binary that gets downloaded when you install Ollama on Linux.

@jmorganca Given this, what do I need to make Ollama accessible in a custom Docker image? If there is no longer a large binary file, what file(s) are needed to make ollama accessible, and where do they get placed during the download process?
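Based on the path layout described earlier in the thread (the binary resolving its `.so` files from `../lib/ollama` relative to itself), a Dockerfile along these lines might work; the base image, prefix, and paths are assumptions, not from the thread:

```dockerfile
# Sketch only: assumes you have already built ./ollama and
# build/lib/ollama/ on the host as described earlier in this thread.
FROM ubuntu:24.04

# Keep bin/ and lib/ollama/ side by side under one prefix so the
# binary can find the GGML libraries at ../lib/ollama.
COPY ollama /opt/ollama/bin/ollama
COPY build/lib/ollama/ /opt/ollama/lib/ollama/

EXPOSE 11434
ENTRYPOINT ["/opt/ollama/bin/ollama", "serve"]
```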


@user-33948 commented on GitHub (Feb 22, 2025):

@jmorganca Wanted to follow up on the question above. Is there still a large (~400 MB) Ollama binary that Ollama runs on? If it has been removed in subsequent updates, does that change what is required to build a Docker image with Ollama?


Reference: github-starred/ollama#5635