[GH-ISSUE #2520] go-1.21 fails to build ollama: C source files not allowed when not using cgo or SWIG: gpu_info_cpu.c gpu_info_cuda.c gpu_info_rocm.c #63513

Closed
opened 2026-05-03 13:57:52 -05:00 by GiteaMirror · 11 comments
Owner

Originally created by @yurivict on GitHub (Feb 15, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2520

===>  Building for ollama-0.1.25
(cd /usr/ports/misc/ollama/work/github.com/ollama/ollama@v0.1.25;  for t in ./cmd; do  out=$(/usr/bin/basename $(echo ${t} |  /usr/bin/sed -Ee 's/^[^:]*:([^:]+).*$/\1/' -e 's/^\.$/ollama/'));  pkg=$(echo ${t} |  /usr/bin/sed -Ee 's/^([^:]*).*$/\1/' -e 's/^ollama$/./');  echo "===>  Building ${out} from ${pkg}";  /usr/bin/env XDG_DATA_HOME=/usr/ports/misc/ollama/work  XDG_CONFIG_HOME=/usr/ports/misc/ollama/work  XDG_CACHE_HOME=/usr/ports/misc/ollama/work/.cache  HOME=/usr/ports/misc/ollama/work PATH=/usr/local/libexec/ccache:/usr/ports/misc/ollama/work/.bin:/home/yuri/.cargo/bin:/home/yuri/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/local/sbin:/usr/local/bin PKG_CONFIG_LIBDIR=/usr/ports/misc/ollama/work/.pkgconfig:/usr/local/libdata/pkgconfig:/usr/local/share/pkgconfig:/usr/libdata/pkgconfig MK_DEBUG_FILES=no MK_KERNEL_SYMBOLS=no SHELL=/bin/sh NO_LINT=YES PREFIX=/usr/local  LOCALBASE=/usr/local  CC="cc" CFLAGS="-O2 -pipe  -fstack-protector-strong -fno-strict-aliasing "  CPP="cpp" CPPFLAGS=""  LDFLAGS=" -fstack-protector-strong " LIBS=""  CXX="c++" CXXFLAGS="-O2 -pipe -fstack-protector-strong -fno-strict-aliasing  " CCACHE_DIR="/tmp/.ccache" BSD_INSTALL_PROGRAM="install  -s -m 555"  BSD_INSTALL_LIB="install  -s -m 0644"  BSD_INSTALL_SCRIPT="install  -m 555"  BSD_INSTALL_DATA="install  -m 0644"  BSD_INSTALL_MAN="install  -m 444" CGO_ENABLED=1  CGO_CFLAGS="-I/usr/local/include"  CGO_LDFLAGS="-L/usr/local/lib"  GOAMD64=  GOARM=  GOTMPDIR="/usr/ports/misc/ollama/work" GOPATH="/usr/ports/distfiles/go/misc_ollama"  GOBIN="/usr/ports/misc/ollama/work/bin"  GO111MODULE=on  GOFLAGS=-modcacherw  GOSUMDB=sum.golang.org GOMAXPROCS=7 GOPROXY=off /usr/local/bin/go121 build -buildmode=exe -v -trimpath -ldflags=-s -buildvcs=false -mod=vendor  -o /usr/ports/misc/ollama/work/bin/${out}  ${pkg};  done)
===>  Building cmd from ./cmd
package github.com/jmorganca/ollama/cmd
        imports github.com/jmorganca/ollama/server
        imports github.com/jmorganca/ollama/gpu: C source files not allowed when not using cgo or SWIG: gpu_info_cpu.c gpu_info_cuda.c gpu_info_rocm.c
*** Error code 1

@mxyng commented on GitHub (Feb 15, 2024):

Judging by your bio, I'm assuming this output is from a FreeBSD build, which is not currently supported.


@yurivict commented on GitHub (Feb 15, 2024):

This is on FreeBSD - I am trying to create the FreeBSD port.


@mxyng commented on GitHub (Feb 15, 2024):

It's missing a build target for FreeBSD. See [gpu.go](https://github.com/ollama/ollama/blob/main/gpu/gpu.go) and [gpu_darwin.go](https://github.com/ollama/ollama/blob/main/gpu/gpu_darwin.go)


@cortexmancer commented on GitHub (Feb 18, 2024):

> This is on FreeBSD - I am trying to create the FreeBSD port.

Maybe this will only work with some kind of Linux emulation layer (FreeBSD's Linuxulator), since CUDA is not implemented on FreeBSD.


@blacklightpy commented on GitHub (Apr 10, 2024):

@dostoievsky shouldn't it work without GPUs too?


@cortexmancer commented on GitHub (Apr 11, 2024):

Hello. I think the CPU implementation is new, and was released after those first comments.
I haven't checked it yet, but I think it will work.

EDIT:
Well, there are no instructions for building with CPU only.
I only found a reference to that for the Docker container, which I am trying right now with bhyve.

Is there a way to set up a manual CPU-only build?


@Tycho-S commented on GitHub (Apr 18, 2024):

I was also trying to build it on FreeBSD (on a machine without GPU but lots of memory) and I ran into the same issue. Came here to find this. That explains a lot.

It would be great if there were a CPU-only build flag.


@yurivict commented on GitHub (Apr 18, 2024):

I am trying to create the FreeBSD port for ollama, and this issue is a show-stopper.


@cortexmancer commented on GitHub (Apr 18, 2024):

@yurivict have you checked https://github.com/ggerganov/llama.cpp ?
It has FreeBSD instructions, and I got it running on my machine with CPU acceleration.
It has a lot of options, CUDA or otherwise.


@yurivict commented on GitHub (Apr 18, 2024):

@dostoievsky

We have had the FreeBSD port for llama-cpp for a while: https://cgit.freebsd.org/ports/tree/misc/llama-cpp

Thank you,
Yuri


@dhiltgen commented on GitHub (May 2, 2024):

Let's close this as a dup of #1102

Reference: github-starred/ollama#63513