[GH-ISSUE #1436] Windows binary race condition #767

Closed
opened 2026-04-12 10:27:03 -05:00 by GiteaMirror · 2 comments

Originally created by @csaben on GitHub (Dec 8, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1436

Cross-compiled a working ollama.exe for Windows, but hit the following error when running a model:

```zsh
./ollama.exe run llama2
```

Errors with:

```
Error: Post "http://127.0.0.1:11434/api/generate": read tcp 127.0.0.1:52248->127.0.0.1:11434: wsarecv: An existing connection was forcibly closed by the remote host.
```

The terminal running `./ollama.exe serve` closes the connection with the following error:

```
2023/12/08 11:27:55 llama.go:143: gguf runner not found
```

These models work fine on this Windows machine in WSL, so I don't think it's the CPU architecture issue mentioned [here](https://github.com/jmorganca/ollama/issues/630).


@BruceMacD commented on GitHub (Dec 12, 2023):

Did you run `go generate ./...` from the root of the project before running `go build .`? The error message you're seeing looks like what happens when the LLM runner isn't packaged into the executable, which that `generate` command should do.
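For reference, a minimal sketch of the build sequence Bruce describes, assuming a standard checkout of the ollama source tree (prerequisites such as Go, a C/C++ toolchain, and CMake are per the repo's own build docs):

```shell
# Sketch of the from-source build steps described above (not official docs).
git clone https://github.com/ollama/ollama.git
cd ollama
go generate ./...   # builds the llama.cpp "gguf" runner and stages it for embedding
go build .          # compiles ollama, packaging the generated runner into the executable
```

Skipping the `go generate` step produces a binary that starts but cannot launch a model runner, which matches the "gguf runner not found" log above.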


@dhiltgen commented on GitHub (Jan 27, 2024):

While not supported yet, the Windows native build is coming along, and we're continuing to improve it. I'm going to close this issue now, as I don't think it's relevant given the current state of main. @csaben, if you run into problems building natively on Windows, please let us know on Discord, or feel free to file a new issue.

Reference: github-starred/ollama#767