[GH-ISSUE #9624] How to Force Ollama to Run on CPU Instead of GPU on Windows? #6280

Closed
opened 2026-04-12 17:42:09 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @dilshodbek-nodejs on GitHub (Mar 10, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9624

What is the issue?

I expected the model to run only on my CPU without using the GPU.

Relevant log output


OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.5.13

GiteaMirror added the question label 2026-04-12 17:42:09 -05:00
Author
Owner

@krenax commented on GitHub (Mar 10, 2025):

Could you please provide the missing log output?

Author
Owner

@rick-github commented on GitHub (Mar 10, 2025):

https://github.com/ollama/ollama/issues/6950#issuecomment-2373663650
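For context, the linked comment describes the usual workarounds: hide the GPU from the Ollama server via the standard `CUDA_VISIBLE_DEVICES` environment variable, or request zero GPU layers per call with the documented `num_gpu` option. A minimal sketch of the latter, assuming a local server on the default port (the model name and prompt are illustrative):

```python
import json

def cpu_only_payload(model: str, prompt: str) -> dict:
    """Build an /api/generate request body that asks Ollama to
    offload zero layers to the GPU, forcing CPU-only inference."""
    return {
        "model": model,
        "prompt": prompt,
        # num_gpu is the documented option for the number of layers
        # to offload to the GPU; 0 keeps everything on the CPU.
        "options": {"num_gpu": 0},
    }

payload = cpu_only_payload("llama3.2", "Hello")
print(json.dumps(payload))
```

The payload can be POSTed to `http://localhost:11434/api/generate`; unlike the environment-variable approach, it applies per request and does not require restarting the server.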

Author
Owner

@pdevine commented on GitHub (Mar 12, 2025):

I'm going to go ahead and close this as answered.


Reference: github-starred/ollama#6280