[PR #11331] [MERGED] API/CLI context enhancements #12248

opened 2025-11-12 16:31:52 -06:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/11331
Author: @dhiltgen
Created: 7/8/2025
Status: Merged
Merged: 7/8/2025
Merged by: @dhiltgen

Base: main ← Head: ctx


📝 Commits (2)

  • 06b07eb API: expose context size of loaded models
  • 50f488c CLI: add context UX

📊 Changes

3 files changed (+14 additions, -9 deletions)

View changed files

📝 api/types.go (+8 -7)
📝 cmd/cmd.go (+3 -2)
📝 server/routes.go (+3 -0)

📄 Description

This extends the ps API to expose each loaded model's context size.
The ps CLI command gains a CONTEXT column showing that value.

Example usage:

% ollama run llama3.2 hello
Hello! How can I assist you today?

% ollama ps
NAME               ID              SIZE      PROCESSOR    CONTEXT    UNTIL              
llama3.2:latest    a80c4f17acd5    4.0 GB    100% GPU     4096       4 minutes from now

% ollama run llama3.2
>>> /set parameter num_ctx 8192
Set parameter 'num_ctx' to '8192'
>>> hello
Hello! How can I assist you today?

>>> /bye
% ollama ps                               
NAME               ID              SIZE      PROCESSOR    CONTEXT    UNTIL              
llama3.2:latest    a80c4f17acd5    5.4 GB    100% GPU     8192       4 minutes from now
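The CLI output above is built from the same data the ps API returns. As a minimal sketch of consuming the new field programmatically (assuming the endpoint is /api/ps and the added JSON field is named context_length; these names are inferred from the PR description, not verified against the merged code):

```python
import json

# Hypothetical /api/ps response body after this PR; the JSON shape and
# the "context_length" field name are assumptions for illustration.
sample = """
{
  "models": [
    {
      "name": "llama3.2:latest",
      "size": 4294967296,
      "context_length": 4096
    }
  ]
}
"""

def summarize(payload: str) -> list[tuple[str, int]]:
    """Return (model name, context size) pairs for each loaded model."""
    data = json.loads(payload)
    return [(m["name"], m["context_length"]) for m in data.get("models", [])]

for name, ctx in summarize(sample):
    print(f"{name}\tcontext={ctx}")
```

In a live setup you would fetch the payload from the running server (e.g. an HTTP GET to the ps endpoint) instead of using the inline sample.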

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2025-11-12 16:31:52 -06:00

Reference: github-starred/ollama-ollama#12248