[GH-ISSUE #11792] Feature Request: Support for Intel Arc B580 GPU Acceleration #33581

Open
opened 2026-04-22 16:26:23 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @siddy81 on GitHub (Aug 7, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11792

Originally assigned to: @dhiltgen on GitHub.

Hi Ollama Team,

I’d like to request official support for GPU acceleration using Intel Arc B580 graphics cards. At the moment, Ollama appears to fall back to CPU execution, even when a capable Intel Arc GPU is available.

Motivation:

Running models on the CPU significantly limits performance and efficiency. The Intel Arc B580 offers strong compute capabilities and is well-suited for modern AI workloads. Additionally, the Arc B580 is part of Intel’s broader push into the GPU market and shows great potential—not only in terms of current hardware performance but also in shaping Intel’s long-term role in the graphics and AI acceleration space.

By supporting Intel Arc GPUs, Ollama could benefit from improved performance for users and align itself with a growing and promising hardware ecosystem.

Thanks for considering this request!

GiteaMirror added the intel, feature request labels 2026-04-22 16:26:23 -05:00
@nickhighland commented on GitHub (Aug 18, 2025):

The B580 and A770 are arguably the best value in GPUs right now, and represent a growing and competitive market. Offering support for these GPUs would make Ollama more financially accessible to a broader range of users.

@FilipLaurentiu commented on GitHub (Sep 4, 2025):

I think they could at least support this GPU via the Vulkan API; then it would just be a matter of changing the installer to detect whether Vulkan (or OpenCL) is supported.
Using oneAPI (Intel's proprietary stack) would, I assume, take more time.

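As a rough illustration of the detection step suggested above, an installer could probe for a working Vulkan loader first and fall back to OpenCL, then CPU. This is only a sketch of the idea, not Ollama's actual installer logic; the `vulkaninfo` and `clinfo` tools are assumed to be the available probes, and the `backend` variable name is hypothetical.

```shell
#!/usr/bin/env sh
# Hypothetical backend probe: prefer Vulkan, fall back to OpenCL, else CPU.
# vulkaninfo ships with Vulkan tools; clinfo lists OpenCL platforms.
if command -v vulkaninfo >/dev/null 2>&1 && vulkaninfo --summary >/dev/null 2>&1; then
    backend=vulkan
elif command -v clinfo >/dev/null 2>&1 && clinfo >/dev/null 2>&1; then
    backend=opencl
else
    backend=cpu
fi
echo "selected backend: $backend"
```

On a machine with an Intel Arc card and current Mesa drivers, the Vulkan branch would typically be taken; on systems with neither API available, the sketch degrades to CPU rather than failing.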
@MaZe3D commented on GitHub (Jan 16, 2026):

Support would also be interesting for the Arc Pro B60, with its 24 GB of VRAM and comparatively low price point.


Reference: github-starred/ollama#33581