[GH-ISSUE #2002] how to enable amd gpu for ollama ? #26916

Closed
opened 2026-04-22 03:39:42 -05:00 by GiteaMirror · 2 comments

Originally created by @hemangjoshi37a on GitHub (Jan 15, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2002

how to enable amd gpu for ollama ?


@jmorganca commented on GitHub (Jan 15, 2024):

Hi there, AMD and ROCm support aren't yet in the released version – I'll merge this with #738 so we can follow along there!


@hemangjoshi37a commented on GitHub (Jan 15, 2024):

@jmorganca I hope AMD and ROCm get support ASAP, because many of my friends have AMD GPUs and want to run Ollama on their PCs. Thanks!

This is a good starting point: https://community.amd.com/t5/ai/how-to-running-optimized-llama2-with-microsoft-directml-on-amd/ba-p/645190

Support for Intel Arc GPUs would also be a cherry on top.


Reference: github-starred/ollama#26916