[GH-ISSUE #14733] GPU selection support #71588

Closed
opened 2026-05-05 02:13:22 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @Eaken on GitHub (Mar 9, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14733

We have multiple GPUs, but we want Ollama to use only one of them at runtime. Can this be supported in Ollama's settings? At present this can be achieved by setting environment variables, but that will affect other applications that also need the GPU, so I would like to set it manually in Ollama. Thanks!

GiteaMirror added the feature request label 2026-05-05 02:13:22 -05:00
Author
Owner

@rick-github commented on GitHub (Mar 9, 2026):

Depending on your GPU, set `CUDA_VISIBLE_DEVICES`, `HIP_VISIBLE_DEVICES`, `ROCR_VISIBLE_DEVICES`, or `GGML_VK_VISIBLE_DEVICES` in the server environment. This only changes GPU selection for Ollama; no other applications are affected.
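
As a sketch of why this doesn't affect other applications: a variable set on a single command line is scoped to that process only, and never enters the parent shell's environment. (The `ollama serve` invocation below is an assumption about how you run the server; adjust for your service manager.)

```shell
# Set the variable only for the server process (GPU 0 in this example):
#   CUDA_VISIBLE_DEVICES=0 ollama serve
# The same pattern works for HIP_VISIBLE_DEVICES, ROCR_VISIBLE_DEVICES,
# or GGML_VK_VISIBLE_DEVICES, depending on your GPU backend.

# The child process sees the variable:
CUDA_VISIBLE_DEVICES=0 sh -c 'echo "child sees: $CUDA_VISIBLE_DEVICES"'

# ...but the surrounding shell (and any other application) does not:
echo "parent sees: ${CUDA_VISIBLE_DEVICES:-unset}"
```

If Ollama runs as a systemd service, the equivalent is adding an `Environment=` line to the unit via `systemctl edit ollama`, which likewise scopes the variable to that service.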

Author
Owner

@Eaken commented on GitHub (Mar 10, 2026):

> Depending on your GPU, set `CUDA_VISIBLE_DEVICES`, `HIP_VISIBLE_DEVICES`, `ROCR_VISIBLE_DEVICES`, or `GGML_VK_VISIBLE_DEVICES` in the server environment. This only changes GPU selection for Ollama; no other applications are affected.

`HIP_VISIBLE_DEVICES` works, thanks!


Reference: github-starred/ollama#71588