[GH-ISSUE #10760] API command PS - does not return GPU/CPU model allocation #53582

Closed
opened 2026-04-29 03:52:11 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @pupphelper on GitHub (May 17, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10760

What is the issue?

The PROCESSOR column showing the % CPU/GPU allocation is displayed when using the `ollama ps` command line. However, this information is not provided by the corresponding API call:
http://localhost:11434/api/ps

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-29 03:52:11 -05:00

@rick-github commented on GitHub (May 17, 2025):

$ ollama ps
NAME             ID              SIZE     PROCESSOR          UNTIL
llama4:latest    4f01ed6b6e01    70 GB    83%/17% CPU/GPU    Forever
$ curl -s localhost:11434/api/ps | jq -r '.models[]|"\(100-(.size_vram/.size)*100|ceil)%/\((.size_vram/.size)*100|floor)%"'
83%/17%
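The jq one-liner above derives the split from fields the API already returns: `size_vram` is the portion of the model resident in GPU memory and `size` is the total, so GPU% is `size_vram/size` and CPU% is the remainder. A minimal Python sketch of the same arithmetic, using illustrative sample values in place of a live `/api/ps` response:

```python
import math

# One entry shaped like a model object from /api/ps.
# The byte counts here are illustrative, not from a real server.
model = {"size": 80_000_000_000, "size_vram": 20_000_000_000}

ratio = model["size_vram"] / model["size"]
gpu_pct = math.floor(ratio * 100)        # share resident in VRAM, rounded down
cpu_pct = 100 - math.ceil(ratio * 100)   # remainder offloaded to CPU

print(f"{cpu_pct}%/{gpu_pct}% CPU/GPU")
```

With these sample values the script prints `75%/25% CPU/GPU`, matching the rounding the jq expression uses (floor for GPU, 100 minus ceil for CPU).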

Reference: github-starred/ollama#53582