v0.12.6 Container Build Failure on Linux #8515

Closed
opened 2025-11-12 14:44:31 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @zchef2k on GitHub (Oct 29, 2025).

What is the issue?

Downloaded the zipped source from the release page. Made no edits to the Dockerfile, build script, or environment variables.

./scripts/build_linux.sh

fails with:

[linux/arm64] [12/16] STEP 12/12: RUN --mount=type=cache,target=/root/.cache/go-build     go build -trimpath -buildmode=pie -o /bin/ollama .
# github.com/ollama/ollama/llama/llama.cpp/src
llama-quant.cpp:656:66: warning: absolute value function 'abs' given an argument of type 'const int64_t' (aka 'const long') but has parameter of type 'int' which may cause truncation of value [-Wabsolute-value]
llama-quant.cpp:656:66: note: use function 'std::abs' instead
--> 7a3e3f1e20a8
Error: 2 errors occurred:
	* [linux/arm64]: determining starting point for build: no FROM statement found
	* [linux/amd64]: determining starting point for build: no FROM statement found

Same result with the current source code. Any advice? I'm looking to play with the Vulkan capability.
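A "no FROM statement found" error generally means the tool invoked as `docker` never parsed the Dockerfile's `FROM` lines at all, e.g. because it is not Docker with BuildKit. A minimal, guarded diagnostic (a sketch assuming a POSIX shell; `docker buildx` is the standard BuildKit front end, and the check is safe to run even when Docker is absent):

```shell
#!/bin/sh
# Report what `docker` resolves to and whether BuildKit (buildx) is available
# before running ./scripts/build_linux.sh.
if command -v docker >/dev/null 2>&1; then
    command -v docker                      # resolved path on PATH
    type docker                            # may also reveal aliases/shims
    docker buildx version 2>/dev/null \
        || echo "buildx/BuildKit not available"
else
    echo "docker not found on PATH"
fi
```

Note that `command -v` only inspects PATH, builtins, and functions; an interactive-shell alias will not show up there, which is one way this class of problem hides.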

Thanks.

OS

Linux

GPU

Intel

CPU

AMD

Ollama version

v0.12.6

GiteaMirror added the bug label 2025-11-12 14:44:31 -06:00
Author
Owner

@zchef2k commented on GitHub (Oct 30, 2025):

Derp.

Closing: `docker` was aliased to `podman`, which stumbled on the Dockerfile syntax. With Docker properly installed, the build succeeds and the discrete GPU is detected as expected.
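For anyone hitting the same setup, a quick way to spot a podman-aliased `docker` (a sketch, not from the thread; it simulates the alias so the check is self-contained, and uses `shopt` only where bash needs it, since aliases are off by default in non-interactive bash):

```shell
# Detect a `docker` that is really an alias for podman.
shopt -s expand_aliases 2>/dev/null || true   # bash: enable aliases in scripts
alias docker=podman                           # simulate the problematic setup
if type docker 2>/dev/null | grep -q podman; then
    echo "docker is aliased/shimmed to podman"
fi
```

Aliases only exist in the shell that defined them, which is why a build script launched as a subprocess can behave differently from the interactive shell where `docker` "works".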


Reference: github-starred/ollama-ollama#8515