[GH-ISSUE #6588] Intel ARC PRO not working on Windows install. #4146

Closed
opened 2026-04-12 15:03:29 -05:00 by GiteaMirror · 1 comment

Originally created by @Solaris17 on GitHub (Sep 2, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6588

What is the issue?

I am having problems getting Ollama to run GPU-accelerated, even with [the GitHub pull request that pulls in llama.cpp](https://github.com/ollama/ollama/pull/3278). I also installed OpenVINO. I am using an Arc Pro A40, and the system also has an iGPU. When attempting to run the software, the following appears in the logs.

This is a Windows install.
https://pastebin.com/Zyn0DSNw

2024/09/01 20:51:22 routes.go:1125: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:C:\\Users\\Solaris17\\.ollama\\models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR:C:\\Users\\Solaris17\\AppData\\Local\\Programs\\Ollama\\lib\\ollama\\runners OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-09-01T20:51:22.669-05:00 level=INFO source=images.go:753 msg="total blobs: 0"
time=2024-09-01T20:51:22.669-05:00 level=INFO source=images.go:760 msg="total unused blobs removed: 0"
time=2024-09-01T20:51:22.670-05:00 level=INFO source=routes.go:1172 msg="Listening on 127.0.0.1:11434 (version 0.3.9)"
time=2024-09-01T20:51:22.686-05:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cuda_v12 rocm_v6.1 cpu cpu_avx cpu_avx2 cuda_v11]"
time=2024-09-01T20:51:22.686-05:00 level=INFO source=gpu.go:200 msg="looking for compatible GPUs"
time=2024-09-01T20:51:22.842-05:00 level=INFO source=gpu.go:347 msg="no compatible GPUs were discovered"
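The two `gpu.go` records at the end are the relevant signals: discovery ran and found nothing, and the env dump shows `OLLAMA_INTEL_GPU:false`. As a minimal sketch (not part of Ollama; `parse_fields` is a hypothetical helper), the `key=value` fields in these slog-style lines can be pulled out like this to scan a long log programmatically:

```python
import re

# Two representative records from the server log above.
LOG_LINES = [
    'time=2024-09-01T20:51:22.686-05:00 level=INFO source=gpu.go:200 msg="looking for compatible GPUs"',
    'time=2024-09-01T20:51:22.842-05:00 level=INFO source=gpu.go:347 msg="no compatible GPUs were discovered"',
]

def parse_fields(line: str) -> dict:
    """Split a slog-style key=value log line into a dict.

    Values are either bare tokens or double-quoted strings;
    surrounding quotes are stripped from the latter.
    """
    fields = {}
    for key, value in re.findall(r'(\w+)=("[^"]*"|\S+)', line):
        fields[key] = value.strip('"')
    return fields

for line in LOG_LINES:
    f = parse_fields(line)
    print(f["source"], "->", f["msg"])
# gpu.go:200 -> looking for compatible GPUs
# gpu.go:347 -> no compatible GPUs were discovered
```

Filtering the full log for `source` values beginning with `gpu.go` is a quick way to isolate the discovery pass from the surrounding startup noise.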

OS

Windows

GPU

Intel

CPU

Intel

Ollama version

0.3.9

GiteaMirror added the bug label 2026-04-12 15:03:29 -05:00

@dhiltgen commented on GitHub (Sep 3, 2024):

Intel ARC GPU support is tracked via #1590 (partial support is checked into the tree if you build from source, but it isn't in our official builds yet)


Reference: github-starred/ollama#4146