[GH-ISSUE #12720] Support for Intel Arc GPU #8441

Closed
opened 2026-04-12 21:07:12 -05:00 by GiteaMirror · 1 comment

Originally created by @kyjar on GitHub (Oct 21, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12720

As a newbie I have finally made Ollama work on my Linux rig, but I realize that support for my Intel Arc B580 is not yet implemented in the latest release.
I would be very grateful if this suggestion could be considered. Thanks.
Br. Kyjar

GiteaMirror added the feature request label 2026-04-12 21:07:12 -05:00

@pdevine commented on GitHub (Oct 21, 2025):

Going to close this as a dupe of #1590. It's coming soon!

You should be able to get this to work if you build from source right now. From the release notes:

> Experimental support for Vulkan is now available when you build locally from source. This will enable additional GPUs from AMD and Intel which are not currently supported by Ollama. To build locally, install the [Vulkan SDK](https://vulkan.lunarg.com/) and set VULKAN_SDK in your environment, then follow the [developer instructions](https://github.com/ollama/ollama/blob/main/docs/development.md). In a future release, Vulkan support will be included in the binary release as well. Please file issues if you run into any problems.
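For reference, the steps in the release note boil down to something like the following. This is a rough sketch only: the exact SDK install path and build commands may differ on your distribution and Ollama version, so treat the paths below as assumptions and follow the linked developer instructions as the source of truth.

```shell
# Sketch of a local Vulkan-enabled build (paths are assumptions; adjust for
# your system and the current developer instructions).

# 1. Install the Vulkan SDK from https://vulkan.lunarg.com/ and point
#    VULKAN_SDK at its root (example path shown, yours will differ).
export VULKAN_SDK=/opt/vulkan-sdk/x86_64

# 2. Fetch the Ollama source.
git clone https://github.com/ollama/ollama.git
cd ollama

# 3. Build per docs/development.md (requires a recent Go toolchain).
go build .

# 4. Run the locally built binary; it should pick up the Arc GPU via Vulkan.
./ollama serve
```

If the GPU is not detected after this, checking the `ollama serve` startup logs for Vulkan/device discovery messages is a good first diagnostic before filing an issue.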


Reference: github-starred/ollama#8441