[PR #11331] API/CLI context enhancements #13511

Closed
opened 2026-04-13 00:29:12 -05:00 by GiteaMirror · 0 comments
Owner

Original Pull Request: https://github.com/ollama/ollama/pull/11331

State: closed
Merged: Yes


This extends the `ps` API to expose the loaded model's context size.
The `ps` CLI command gains a CONTEXT column showing that value.

Example usage:

```
% ollama run llama3.2 hello
Hello! How can I assist you today?

% ollama ps
NAME               ID              SIZE      PROCESSOR    CONTEXT    UNTIL
llama3.2:latest    a80c4f17acd5    4.0 GB    100% GPU     4096       4 minutes from now

% ollama run llama3.2
>>> /set parameter num_ctx 8192
Set parameter 'num_ctx' to '8192'
>>> hello
Hello! How can I assist you today?

>>> /bye
% ollama ps
NAME               ID              SIZE      PROCESSOR    CONTEXT    UNTIL
llama3.2:latest    a80c4f17acd5    5.4 GB    100% GPU     8192       4 minutes from now
```
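Clients of the `ps` API can read the new field from the JSON response. A minimal sketch, assuming the endpoint is `GET /api/ps` and the new field is named `context_length` (the exact field name is an assumption, not confirmed by the PR text above):

```python
import json

# Hypothetical /api/ps response body; the "context_length" field name
# and the overall shape are assumptions for illustration only.
raw = """
{
  "models": [
    {
      "name": "llama3.2:latest",
      "size": 5733537792,
      "context_length": 8192
    }
  ]
}
"""

payload = json.loads(raw)

# Print each loaded model's name alongside its context size, mirroring
# the CONTEXT column shown in the `ollama ps` output above.
for model in payload["models"]:
    print(model["name"], model.get("context_length"))
```

In a real client you would fetch this body over HTTP from the running server instead of using a hardcoded string.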
GiteaMirror added the pull-request label 2026-04-13 00:29:12 -05:00

Reference: github-starred/ollama#13511