[GH-ISSUE #1090] Suggestions for instruction clarifications for running in docker in Windows. #62577

Closed
opened 2026-05-03 09:36:06 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @pdavis68 on GitHub (Nov 11, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1090

Originally assigned to: @dhiltgen on GitHub.

I just got this installed in Windows using Docker.

The instructions were a bit unclear, since the steps for installing the Nvidia components are Linux-based. I mistakenly thought I needed to run the container and install all the Nvidia software inside it. It might help other people like me who aren't so clever to note that those are Linux-specific instructions for installing the Nvidia drivers, and to add separate instructions for installing the Nvidia drivers under Windows.

For Windows, you can install the Nvidia drivers (I'm not sure exactly which ones worked for this, because I've had them installed for a while; I have the basic drivers and CUDA, and I'm guessing it's using the CUDA drivers) and then run the container with the GPU as per the instructions. It worked like a charm.
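For reference, the GPU-enabled run command I followed looks roughly like this (a sketch based on the Docker docs at the time; the volume name and port mapping are the documented defaults, and `--gpus=all` assumes the host Nvidia driver plus Docker's GPU support are already set up on Windows):

```shell
# Pull and run the Ollama container with all host GPUs exposed.
# -v persists models in a named volume; -p exposes the API on the default port.
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```

On Windows this requires Docker Desktop with the WSL2 backend; the Nvidia driver is installed on the Windows host, not inside the container.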

Also, just want to say, really nice work. I've tried installing a few of these local LLMs and this was by far the easiest for me to install (despite the above issues) and it works really well. I couldn't be happier.

I've been working on a game that uses LLMs and the cost of running with OpenAI was going to be higher than I'd like, so I've been waiting for a local version I could use instead and you guys have delivered. I'm really excited about this.

GiteaMirror added the documentation label 2026-05-03 09:36:06 -05:00

@dhiltgen commented on GitHub (Mar 12, 2024):

I'm inclined to close this one now that we have a native Windows version available that supports both Nvidia and AMD GPUs. For most users, that's going to be a far better option than WSL2. WSL2 still works and we'll continue to support it for folks who want to use it, but the simple flow is just run OllamaSetup.exe and as long as you've installed the Nvidia or AMD driver for your GPU, and have a supported GPU, it should just work.


Reference: github-starred/ollama#62577