[GH-ISSUE #2034] Not running on gpu #26935

Closed
opened 2026-04-22 03:43:49 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @DragonBtc93 on GitHub (Jan 17, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2034

I'm on Ubuntu 22.04 with an Nvidia Tesla P40 and a K80 GPU, and Ollama will not use the GPU. I can use text-generation-webui and it does use the GPU.


@dhiltgen commented on GitHub (Jan 26, 2024):

Assuming these are in the same system, the K80 is the problem. That GPU is a Compute Capability 3.7 card, while the P40 is a Compute Capability 6.1 card. 6.1 is supported today, but 3.x cards are not yet supported; that work is tracked in issue #1756.
We don't yet have a solid way to ignore unsupported cards and use only the supported ones, so we disable GPU mode if we detect any GPU that isn't supported. As a workaround until #1756 is fixed, you can pull the K80 and Ollama should then run on the P40.

https://developer.nvidia.com/cuda-gpus
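To check which cards in a mixed system fall below the supported threshold, the compute capability of each GPU can be queried with `nvidia-smi --query-gpu=name,compute_cap --format=csv,noheader` (the `compute_cap` query field is available in recent drivers). The sketch below parses that CSV output and flags cards under an assumed minimum of 5.0; the helper name and the exact cutoff are illustrative assumptions, not Ollama's implementation — check the Ollama docs for the current minimum.

```python
# Sketch: flag GPUs below an assumed minimum compute capability.
# Input format matches `nvidia-smi --query-gpu=name,compute_cap --format=csv,noheader`.
MIN_COMPUTE_CAP = 5.0  # assumption; verify against Ollama's documented minimum


def unsupported_gpus(nvidia_smi_csv: str, minimum: float = MIN_COMPUTE_CAP):
    """Return (name, compute_cap) pairs for GPUs below `minimum`."""
    flagged = []
    for line in nvidia_smi_csv.strip().splitlines():
        name, cap = (field.strip() for field in line.split(","))
        if float(cap) < minimum:
            flagged.append((name, float(cap)))
    return flagged


# The two cards from this issue: K80 is CC 3.7, P40 is CC 6.1.
sample = "Tesla K80, 3.7\nTesla P40, 6.1"
print(unsupported_gpus(sample))  # → [('Tesla K80', 3.7)]
```

With output like the sample above, only the K80 is flagged, which matches the advice in the comment: remove the K80 and the P40 alone should be usable.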


Reference: github-starred/ollama#26935