[GH-ISSUE #12323] 🤔v0.11.11 stops running models in GPU and switches to using only CPU while "ollama ps" shows models as still 100% in GPU! #70247

Closed
opened 2026-05-04 20:46:31 -05:00 by GiteaMirror · 3 comments

Originally created by @FieldMouse-AI on GitHub (Sep 17, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12323

What is the issue?

After about a couple of days of running, Ollama v0.11.11 stops running models on the GPU and switches to using only the CPU, while `ollama ps` still shows the models as 100% on the GPU.

I find that simply restarting the ollama container makes things work again, but it is concerning that Ollama gets into that state:

```
$ docker compose -f docker-compose-gpu.yaml down ollama
$ docker compose -f docker-compose-gpu.yaml up -d ollama
```
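
Not part of the original report, but for anyone trying to reproduce this: below is a minimal sketch (assuming the compose service is named `ollama`, as in the file above) of how to check whether inference has actually fallen back to the CPU even while `ollama ps` reports 100% GPU.

```
# Sketch only (not from the original report); service name "ollama" assumed
# from the compose file shown above.
$ docker compose -f docker-compose-gpu.yaml exec ollama nvidia-smi              # is the GPU still visible inside the container?
$ nvidia-smi                                                                     # on the host: watch utilization/VRAM while a prompt is running
$ docker compose -f docker-compose-gpu.yaml logs ollama | grep -iE 'cuda|gpu'    # look for the runner reporting GPU loss or a CPU fallback
```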

My system configuration:

  • OS: Host and Docker: Ubuntu Linux 22.04.5 LTS
  • CPU: Intel i5-6500
  • RAM: 40GB
  • GPU: NVIDIA RTX 3060, 12GB VRAM
  • Ollama version: 0.11.11

Thank you for your time.

Relevant log output

(none provided)

OS: Docker
GPU: Nvidia
CPU: Intel
Ollama version: 0.11.11

GiteaMirror added the bug label 2026-05-04 20:46:32 -05:00
@rick-github commented on GitHub (Sep 17, 2025):

https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#linux-docker

@FieldMouse-AI commented on GitHub (Sep 17, 2025):

> https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#linux-docker

Ah! Thanks for the reference!

It just so happens my Docker host already has `/etc/docker/daemon.json`, so I can make this change and test it out.
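
For completeness, the kind of change being discussed: a minimal sketch, assuming the linked troubleshooting section's fix is to move Docker off systemd cgroup management (take the exact key from that doc; the lines below are illustrative, not the reporter's actual configuration).

```
# Sketch only: assumes the linked doc's fix is disabling systemd cgroup
# management for Docker. On the host, merge this key into the existing
# /etc/docker/daemon.json (keep the NVIDIA runtime entries already there):
#
#   "exec-opts": ["native.cgroupdriver=cgroupfs"]
#
# then restart Docker and bring the ollama service back up.
$ sudo $EDITOR /etc/docker/daemon.json
$ sudo systemctl restart docker
$ docker info 2>/dev/null | grep -i 'cgroup driver'        # should now report cgroupfs
$ docker compose -f docker-compose-gpu.yaml up -d ollama
```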


@pdevine commented on GitHub (Sep 17, 2025):

Going to go ahead and mark this as answered (ty @rick-github !), but we can reopen it if you're still having the issue.


Reference: github-starred/ollama#70247