[GH-ISSUE #9375] Update PS API to show whether a model is active or not #31880

Open
opened 2026-04-22 12:39:31 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @ankh2054 on GitHub (Feb 26, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9375

Hi,

Wondering if there is scope to expand `ollama ps` to show whether inference is currently happening, or what the current load on the GPU is.
Currently, `ollama ps` shows historical values. It would be great to be able to see whether an Ollama server is currently busy.

GiteaMirror added the feature request label 2026-04-22 12:39:31 -05:00

@rick-github commented on GitHub (Feb 26, 2025):

`ollama ps` doesn't show GPU load, it shows how much of the model is loaded on the GPU/CPU. If you want to monitor load, Windows has a system monitor and Linux users use `nvtop`.


@ankh2054 commented on GitHub (Feb 26, 2025):

I wasn't explaining myself correctly; I was thinking about whether a model is currently active. I think I have found a way by looking at `refCount`, which I believe shows how many active requests are using the model. I need to do some testing on my end first.

Will report back.


@rick-github commented on GitHub (Feb 26, 2025):

https://github.com/ollama/ollama/issues/3144


@ankh2054 commented on GitHub (Feb 27, 2025):

Okay, I can confirm it works; I have created a pull request. This is my first time doing a PR for Ollama, so apologies if I did something wrong.
https://github.com/ollama/ollama/pull/9392

The example below shows whether a model is currently active or not:

```
user@unknown ~/ollama (feature/active-inference-status)> ./ollama ps
NAME              ID              SIZE      PROCESSOR    ACTIVE    UNTIL
deepseek-r1:7b    0a8c26691023    6.0 GB    100% GPU     Yes       4 minutes from now
user@unknown ~/ollama (feature/active-inference-status)> ./ollama ps
NAME              ID              SIZE      PROCESSOR    ACTIVE    UNTIL
deepseek-r1:7b    0a8c26691023    6.0 GB    100% GPU     No        4 minutes from now
```
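For anyone wanting to script against this, the tabular output above can be parsed with a few lines of Python. This is a sketch based only on the column layout shown in the example; the `ollama ps` text output is not a stable machine-readable contract, and the `ACTIVE` column only exists in the proposed PR:

```python
import re

def parse_ps(output: str) -> list[dict]:
    """Parse the whitespace-separated table printed by `ollama ps`.

    Columns (NAME, ID, SIZE, PROCESSOR, ACTIVE, UNTIL) are taken from
    the example in this thread; fields are separated by runs of two or
    more spaces, so values like "6.0 GB" or "100% GPU" stay intact.
    """
    lines = [ln for ln in output.strip().splitlines() if ln.strip()]
    rows = []
    for line in lines[1:]:  # skip the header row
        name, model_id, size, processor, active, until = re.split(r"\s{2,}", line.strip())
        rows.append({
            "name": name,
            "id": model_id,
            "size": size,
            "processor": processor,
            "active": active == "Yes",
            "until": until,
        })
    return rows
```

With the first transcript above as input, `parse_ps` would report `deepseek-r1:7b` as active.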

Reference: github-starred/ollama#31880