[GH-ISSUE #1548] When is the Windows Version of Ollama Coming out? #845

Closed
opened 2026-04-12 10:30:34 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @Arnav3241 on GitHub (Dec 15, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1548

Hey there,
When is the Windows Version of Ollama Coming out?
Several hundred people, myself included, have been waiting for it eagerly since the beginning of this project. Hope to see that update soon, as most people use Windows.


@Titaniumtown commented on GitHub (Dec 16, 2023):

Can't you just use WSL? I haven't used Windows in a little over a decade now, but wouldn't that work?

<!-- gh-comment-id:1858730146 -->

@mongolu commented on GitHub (Dec 16, 2023):

It works, and it works very well.
🥂👍😀

<!-- gh-comment-id:1858793040 -->

@easp commented on GitHub (Dec 17, 2023):

I don't know the answer, but I think they are working on getting foundations in shape before enabling a new platform.

One thing a maintainer mentioned recently is that they wanted ROCm support before releasing a Windows version, since there are so many machines out there with AMD GPUs. I assume they want people to have a good experience, and also to avoid being inundated with complaints from Windows users about slow text-generation performance (probably).

<!-- gh-comment-id:1858994797 -->

@technovangelist commented on GitHub (Dec 19, 2023):

What @easp said is exactly correct. Issue #403 is probably the best one to watch for this release. I will go ahead and close this issue now. If you think there is anything we left out, reopen it and we can address it. Thanks for being part of this great community.

<!-- gh-comment-id:1863358307 -->

@Arnav3241 commented on GitHub (Mar 2, 2024):

It has finally come. And no, I couldn't have used WSL, as I had to build an automated Python program with it.

<!-- gh-comment-id:1974810016 -->
Reference: github-starred/ollama#845