[GH-ISSUE #7293] 0.4.0rc0 arm64 Android Termux compile error #30397

Closed
opened 2026-04-22 09:59:01 -05:00 by GiteaMirror · 13 comments

Originally created by @fxmbsw7 on GitHub (Oct 21, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7293

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

Earlier releases built easily with `go generate ./...` followed by `go build .`

What's the new build process? Compiling the .c files with gcc instead of going through go? The README doesn't seem to cover compilation.

Running `go generate ./...` returns the output below; I'll retry without go.

```
~/ollama-0.4.0-rc0 $ go generate ./...
<rm cmd removed for discord>
make -f make/Makefile.default
make[1]: Entering directory '/data/data/com.termux/files/home/ollama-0.4.0-rc0/llama'
fatal: not a git repository (or any parent up to mount point /)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
GOARCH=arm64 go build -buildmode=pie "-ldflags=-w -s \"-X=github.com/ollama/ollama/version.Version=\" \"-X=github.com/ollama/ollama/llama.CpuFeatures=\" " -trimpath   -o /data/data/com.termux/files/home/ollama-0.4.0-rc0/llama/build/linux-arm64/runners/cpu/ollama_llama_server ./runner
# github.com/ollama/ollama/llama
ggml-quants.c:4023:88: error: always_inline function 'vmmlaq_s32' requires target feature 'i8mm', but would be inlined into function 'ggml_vec_dot_q4_0_q8_0' that is compiled without support for 'i8mm'
ggml-quants.c:4023:76: error: always_inline function 'vmmlaq_s32' requires target feature 'i8mm', but would be inlined into function 'ggml_vec_dot_q4_0_q8_0' that is compiled without support for 'i8mm'
ggml-quants.c:4023:64: error: always_inline function 'vmmlaq_s32' requires target feature 'i8mm', but would be inlined into function 'ggml_vec_dot_q4_0_q8_0' that is compiled without support for 'i8mm'
ggml-quants.c:4023:52: error: always_inline function 'vmmlaq_s32' requires target feature 'i8mm', but would be inlined into function 'ggml_vec_dot_q4_0_q8_0' that is compiled without support for 'i8mm'
make[1]: *** [make/Makefile.default:27: /data/data/com.termux/files/home/ollama-0.4.0-rc0/llama/build/linux-arm64/runners/cpu/ollama_llama_server] Error 1
make[1]: Leaving directory '/data/data/com.termux/files/home/ollama-0.4.0-rc0/llama'
make: *** [Makefile:41: default] Error 2
llama/llama.go:3: running "make": exit status 2
```
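The clang error above means ggml-quants.c was compiled with `-D__ARM_FEATURE_MATMUL_INT8` on a toolchain/CPU combination without the i8mm extension enabled. A quick way to check whether the SoC actually advertises i8mm is to read the `Features` line of /proc/cpuinfo; here is a minimal sketch (names are illustrative, not part of the ollama source):

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// hasFeature reports whether the given ARM feature (e.g. "i8mm")
// appears on any "Features" line of /proc/cpuinfo content.
func hasFeature(cpuinfo, feature string) bool {
	for _, line := range strings.Split(cpuinfo, "\n") {
		if !strings.HasPrefix(line, "Features") {
			continue
		}
		for _, f := range strings.Fields(line) {
			if f == feature {
				return true
			}
		}
	}
	return false
}

func main() {
	data, err := os.ReadFile("/proc/cpuinfo")
	if err != nil {
		fmt.Println("cannot read /proc/cpuinfo:", err)
		return
	}
	if hasFeature(string(data), "i8mm") {
		fmt.Println("CPU reports i8mm: the default flags should be safe")
	} else {
		fmt.Println("CPU does not report i8mm: drop -D__ARM_FEATURE_MATMUL_INT8")
	}
}
```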

### OS

Linux

### GPU

Other

### CPU

Other

### Ollama version

0.4.0rc0
GiteaMirror added the linuxbuildbug labels 2026-04-22 09:59:02 -05:00

@dhiltgen commented on GitHub (Oct 22, 2024):

I think we'll need to move https://github.com/ollama/ollama/blob/main/llama/llama.go#L37-L38 `-D__ARM_FEATURE_MATMUL_INT8` behind a go build tag, and then add some logic to the makefile to detect the arm architecture and adjust tags accordingly.
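The "detect and adjust" idea can be sketched as a mapping from detected CPU features to the optional defines, so that `-D__ARM_FEATURE_MATMUL_INT8` is only emitted when i8mm is actually present. This is a hypothetical helper illustrating the shape of the fix, not the actual ollama change:

```go
package main

import (
	"fmt"
	"strings"
)

// extraCFLAGS maps detected CPU features to the optional -D defines a
// build could add, gating -D__ARM_FEATURE_MATMUL_INT8 instead of
// hard-coding it for every arm64 target.
func extraCFLAGS(features []string) string {
	defines := map[string]string{
		"i8mm": "-D__ARM_FEATURE_MATMUL_INT8",
		"sve":  "-D__ARM_FEATURE_SVE",
	}
	var flags []string
	for _, f := range features {
		if d, ok := defines[f]; ok {
			flags = append(flags, d)
		}
	}
	return strings.Join(flags, " ")
}

func main() {
	// On a CPU without i8mm, the feature list yields no extra defines.
	fmt.Println(extraCFLAGS([]string{"fp", "asimd", "i8mm"}))
}
```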

<!-- gh-comment-id:2429683963 -->

@fxmbsw7 commented on GitHub (Oct 23, 2024):

Sounds like what's needed. But please also make this build feature cover more options, like `-D__ARM_FEATURE_FMA` and really all the optional extensions. I enabled one or two at some point and they showed up afterwards (I enabled all I could find, and some of them worked).

greets ..

<!-- gh-comment-id:2431548739 -->

@yashasnadigsyn commented on GitHub (Nov 1, 2024):

Any updates on this? Downloading and building ollama in Termux gives me the same error as above.

For people coming from Google: #7292

<!-- gh-comment-id:2451285762 -->

@fxmbsw7 commented on GitHub (Nov 1, 2024):

No. After checking out, you need to delete the int8 part (through the trailing space) on both of the mentioned lines, then compile.


<!-- gh-comment-id:2451309269 -->

@fxmbsw7 commented on GitHub (Nov 7, 2024):

I made it work by adding `-march=armv8.6-a` but without SVE. I'm looking further into the optional extensions.

<!-- gh-comment-id:2461040554 -->

@fxmbsw7 commented on GitHub (Nov 10, 2024):

`-march=armv8.7-a` works well. Why don't you add `-march=armv8-a` (or the right equivalent) on both arm64 Linux targets? Then it would work. No one compiles for v7, since most of those are 32-bit... I think.
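The workaround above amounts to picking a `-march` value that matches the CPU's feature probes. A sketch of that selection logic (the flag strings follow GCC/clang aarch64 conventions, and the version choices mirror what was reported working in this thread; the function name is illustrative):

```go
package main

import "fmt"

// marchFlag picks a conservative -march value for the C parts of the
// build: armv8.7-a when i8mm is available (optionally masking SVE with
// the +nosve modifier), plain armv8-a otherwise.
func marchFlag(hasI8MM, hasSVE bool) string {
	switch {
	case hasI8MM && !hasSVE:
		return "-march=armv8.7-a+nosve"
	case hasI8MM:
		return "-march=armv8.7-a"
	default:
		return "-march=armv8-a"
	}
}

func main() {
	// Example invocation one might pass to the build on a Termux device
	// that has i8mm but no SVE.
	fmt.Println("CGO_CFLAGS=" + marchFlag(true, false) + " make -f make/Makefile.default")
}
```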

<!-- gh-comment-id:2466831044 -->

@olamarvel commented on GitHub (Dec 30, 2024):

Contrary to reports that this fix makes Termux work, it doesn't on my phone. I have spent hours trying every fix I found, so I decided to seek help.

Was it intentional for armv8l to be unsupported? Because I keep getting this:

```
cmd/interactive.go:4:2: package cmp is not in GOROOT (/usr/lib/go-1.19/src/cmp)
envconfig/config.go:5:2: package log/slog is not in GOROOT (/usr/lib/go-1.19/src/log/slog)
llama/llama.go:95:2: package slices is not in GOROOT (/usr/lib/go-1.19/src/slices)
../go/pkg/mod/golang.org/x/crypto@v0.31.0/curve25519/curve25519.go:13:8: package crypto/ecdh is not in GOROOT (/usr/lib/go-1.19/src/crypto/ecdh)
server/download.go:11:2: package math/rand/v2 is not in GOROOT (/usr/lib/go-1.19/src/math/rand/v2)
GOARCH=armv8l go build "-ldflags=-w -s \"-X=github.com/ollama/ollama/version.Version=459d822-dirty\" " -trimpath -o ollama .
go: unsupported GOOS/GOARCH pair linux/armv8l
make[1]: *** [make/Makefile.ollama:13: ollama] Error 2
make: *** [Makefile:51: exe] Error 2
root@localhost:~/ollama# cd llama
root@localhost:~/ollama/llama# nano llama.go
root@localhost:~/ollama/llama# cd ../
root@localhost:~/ollama# make
cmd/interactive.go:4:2: package cmp is not in GOROOT (/usr/lib/go-1.19/src/cmp)
envconfig/config.go:5:2: package log/slog is not in GOROOT (/usr/lib/go-1.19/src/log/slog)
llama/llama.go:95:2: package slices is not in GOROOT (/usr/lib/go-1.19/src/slices)
../go/pkg/mod/golang.org/x/crypto@v0.31.0/curve25519/curve25519.go:13:8: package crypto/ecdh is not in GOROOT (/usr/lib/go-1.19/src/crypto/ecdh)
server/download.go:11:2: package math/rand/v2 is not in GOROOT (/usr/lib/go-1.19/src/math/rand/v2)
```

So I don't stress myself any longer.
Thank you.

<!-- gh-comment-id:2565259500 -->
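Note that the failure above is a Go toolchain problem rather than the i8mm one: `cmp` and `log/slog` entered the standard library in Go 1.21 and `math/rand/v2` in Go 1.22, so the Go 1.19 visible in the paths (`/usr/lib/go-1.19`) cannot build recent ollama sources. A small version check, as a sketch (helper name is illustrative):

```go
package main

import (
	"fmt"
	"runtime"
	"strconv"
	"strings"
)

// minorVersion extracts N from a "go1.N" or "go1.N.P" toolchain string.
func minorVersion(v string) int {
	v = strings.TrimPrefix(v, "go")
	parts := strings.Split(v, ".")
	if len(parts) < 2 {
		return 0
	}
	n, _ := strconv.Atoi(parts[1])
	return n
}

func main() {
	// math/rand/v2 requires Go >= 1.22, so anything older fails the
	// way the comment above shows.
	if minorVersion(runtime.Version()) < 22 {
		fmt.Println("Go toolchain too old for these sources; install Go >= 1.22")
	} else {
		fmt.Println("Go toolchain is recent enough")
	}
}
```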

@fxmbsw7 commented on GitHub (Dec 30, 2024):

It looks like some packages are missing. Please try my script; it includes some `go get` commands and recursive gets. Just a try... I'm no Go pro.

http://0x0.st/8sgT.sh

<!-- gh-comment-id:2565376112 -->

@fxmbsw7 commented on GitHub (Dec 30, 2024):

I don't know armv8l; I use `armv8.7-a` in the script.

<!-- gh-comment-id:2565377558 -->

@olamarvel commented on GitHub (Dec 30, 2024):

> it looks some packages are missing
> plz try my script , it includes some go get cmds and recursive get's
> just a try .. im no go pro ..
>
> http://0x0.st/8sgT.sh

OK, I'll try it.
Thanks

<!-- gh-comment-id:2565394122 -->

@olamarvel commented on GitHub (Dec 30, 2024):

> it looks some packages are missing
> plz try my script , it includes some go get cmds and recursive get's
> just a try .. im no go pro ..
>
> http://0x0.st/8sgT.sh

I get:

```
error: 5636 bytes of body are still expected
fetch-pack: unexpected disconnect while reading sideband packet
fatal: early EOF
fatal: fetch-pack: invalid index-pack output
8sgT.sh: line 19: cd: /data/data/com.termux/files/home/ollama.3336986989: No such file or directory

err : cant cd into '/data/data/com.termux/files/home/ollama.3336986989'
ran for 105 seconds .. bye ..
```

<!-- gh-comment-id:2565404143 -->

@fxmbsw7 commented on GitHub (Dec 30, 2024):

Looks like an incomplete git transfer?


<!-- gh-comment-id:2565709842 -->

@fxmbsw7 commented on GitHub (Dec 31, 2024):

Well, it looks like a git error. If you get past that with my script, you'll need to edit the script's `armv8.7-a` to your `l` variant.

<!-- gh-comment-id:2566541322 -->
Reference: github-starred/ollama#30397