[GH-ISSUE #12049] Can't build on Windows even after following instructions #54515

Closed
opened 2026-04-29 06:14:06 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @xodiumluma on GitHub (Aug 23, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12049

What is the issue?

Hi, thanks for your hard work on this project.
I tried all the points in the dev docs, downloaded the TDM-GCC compiler, and added it to `$Env:PATH`.
I made sure I had the Windows prerequisites.
CMake runs OK.
But when I run `go run . serve` I get the errors below (see log output).

Relevant log output

# github.com/ollama/ollama/discover
discover\amd_windows.go:115:21: undefined: rocmMinimumMemory
discover\amd_windows.go:125:57: undefined: IGPUMemLimit
discover\amd_windows.go:128:29: undefined: unsupportedGPUs
discover\amd_windows.go:139:29: undefined: unsupportedGPUs

OS

Windows

GPU

Intel

CPU

Intel

Ollama version

Can't build executable from GH source so cannot provide version

THANKSSS

GiteaMirror added the bug label 2026-04-29 06:14:06 -05:00

@Paul-Grant2000 commented on GitHub (Aug 23, 2025):

It seems like a conflict between the GPU and the Ollama version.


@xodiumluma commented on GitHub (Aug 23, 2025):

OK, finally got it working. I didn't know that `tdm64-gcc-10.3.0-2.exe` was actually an archive, so I ran it, added the `bin` folder to `$Env:PATH`, ran `cmake` with VS 2022, and now `go run . serve` works nicely.

THANK YOU, sorry for the trouble.
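For reference, the working sequence described above might look roughly like this in PowerShell. This is a hedged sketch, not the official build procedure: the TDM-GCC install path is an assumption (adjust it to wherever the archive was extracted), and the `cmake` invocations follow the generic configure/build pattern rather than any project-specific flags.

```shell
# Assumed extraction location for TDM-GCC; adjust to your actual path
$Env:PATH += ";C:\TDM-GCC-64\bin"

# Configure and build the native components from the repo root
# (VS 2022 is picked up as the default generator when installed)
cmake -B build
cmake --build build --config Release

# Then run the server directly from source
go run . serve
```

Opening a fresh terminal after editing `$Env:PATH` (or setting it persistently via System Properties) avoids the common pitfall of the compiler being visible in one shell but not another.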


Reference: github-starred/ollama#54515