[GH-ISSUE #7035] Support AMD GPUs via WSL #66518

Closed
opened 2026-05-04 07:15:49 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @vignessh on GitHub (Sep 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7035

Hello,

I'm running a Windows 11 workstation with an AMD RX 7900 XTX GPU. I installed the latest Ollama for Windows, and with that I can see the GPU being used for queries. I also tried the Linux install under WSL, following [this guide](https://community.amd.com/t5/ai/running-llms-locally-on-amd-gpus-with-ollama/ba-p/713266). With WSL, however, even after installing ROCm as the guide describes, Ollama does not use the GPU. I also tried building Ollama locally with the fix mentioned in [this issue](https://github.com/ollama/ollama/issues/5275), but without success: the Ollama server crashes whenever I attempt to run any model, such as `llama2` or `llama3.2`.

Can someone please help with this? I really don't want to switch to an Nvidia GPU just for this purpose.

Thanks,
Vignessh
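
Not part of the original report, but as a first diagnostic one can check whether WSL2 exposes a GPU to the guest at all before troubleshooting Ollama or ROCm. A minimal sketch, assuming a standard WSL2 setup where GPU paravirtualization is surfaced through the `/dev/dxg` device:

```shell
#!/bin/sh
# WSL2 exposes GPU compute to the Linux guest through the /dev/dxg device
# (the dxgkrnl paravirtualization driver). If it is missing, neither CUDA
# nor ROCm can reach the GPU from inside WSL, regardless of how Ollama is
# installed or built.
if [ -e /dev/dxg ]; then
  echo "WSL GPU device present: /dev/dxg"
else
  echo "no /dev/dxg: GPU paravirtualization is not available in this guest"
fi
```

If `/dev/dxg` is present but Ollama still falls back to CPU, the limitation is more likely in ROCm's WSL support for the specific GPU, which is what #5275 tracks.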

GiteaMirror added the amd, wsl, windows, feature request labels 2026-05-04 07:15:51 -05:00

@dhiltgen commented on GitHub (Sep 30, 2024):

We're tracking this with #5275


Reference: github-starred/ollama#66518