[GH-ISSUE #7945] How to specify the GPU number when running an Ollama model? #5086

Closed
opened 2026-04-12 16:10:59 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @cqray1990 on GitHub (Dec 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7945

What is the issue?

How do I specify which GPU to use when running an Ollama model?
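The issue received no replies before being closed, but the commonly documented approach (per Ollama's FAQ) is to limit which GPUs the runtime can see via environment variables before starting the server. The device indices below are illustrative; check `nvidia-smi` (or `rocm-smi`) for the indices on your own system:

```shell
# Restrict Ollama to specific GPUs by limiting which devices the
# runtime can see, then start the server in that environment.
# (Indices are examples only.)

# NVIDIA (CUDA): expose only GPU 1
CUDA_VISIBLE_DEVICES=1 ollama serve

# NVIDIA (CUDA): expose GPUs 0 and 2
CUDA_VISIBLE_DEVICES=0,2 ollama serve

# AMD (ROCm): expose only GPU 0
ROCR_VISIBLE_DEVICES=0 ollama serve
```

If Ollama runs as a systemd service, the variable has to be set on the service rather than in your shell, e.g. via `systemctl edit ollama.service` and adding `Environment="CUDA_VISIBLE_DEVICES=1"` under `[Service]`, then restarting the service.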

OS

Linux

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-12 16:10:59 -05:00

Reference: github-starred/ollama#5086