[GH-ISSUE #2968] How to specify which GPU to use when starting Ollama #27583

Closed
opened 2026-04-22 05:02:49 -05:00 by GiteaMirror · 2 comments

Originally created by @lyj555 on GitHub (Mar 7, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2968

Currently on Linux there are two GPUs, and Ollama seems to be occupying part of both. How can I specify which GPU to use when starting Ollama?
[Screenshot attached: WeCom capture, 2024-03-07 10:17]


@samavedulark commented on GitHub (Mar 8, 2024):

If you are using Docker (which is a better choice), run the command below. To be clear, select an unoccupied GPU:
docker run -d --gpus 'device=1' -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
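To confirm the container only sees the selected GPU, you could list the devices visible inside it. A quick check, assuming the container is named `ollama` as in the command above:

```shell
# nvidia-smi -L lists the GPUs visible inside the container;
# with --gpus 'device=1' it should show a single entry.
docker exec ollama nvidia-smi -L
```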


@jmorganca commented on GitHub (Mar 12, 2024):

Hi there, there isn't a direct way to decide which GPU to use for Ollama yet. This is something I hope will be coming soon! In the meantime there are some workarounds, like `CUDA_VISIBLE_DEVICES` (https://developer.nvidia.com/blog/cuda-pro-tip-control-gpu-visibility-cuda_visible_devices/) or the Docker workaround @samavedulark mentioned. Stay tuned for more!
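The `CUDA_VISIBLE_DEVICES` workaround can be sketched as follows. A minimal example, assuming GPU index 1 (as reported by `nvidia-smi`) is the one you want Ollama to use:

```shell
# Hide every GPU except index 1 from CUDA applications.
# This must be set in the environment of the Ollama server
# process, not the client.
export CUDA_VISIBLE_DEVICES=1

# Start the Ollama server; it will only see (and use) GPU 1.
ollama serve
```

If Ollama runs as a systemd service, the variable would instead go into the service's environment (e.g. via an `Environment=` override added with `systemctl edit ollama`).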

Reference: github-starred/ollama#27583