[PR #15691] add token calculation support with UI display (#15639) #61962

Open
opened 2026-04-29 16:55:52 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/15691
Author: @KunjShah95
Created: 4/19/2026
Status: 🔄 Open

Base: main ← Head: main


📝 Commits (7)

  • d2c3751 feat: add token calculation support with UI display (#15639)
  • 0610c08 feat: add token calculation support with UI display (#15639)
  • 7059cc9 Update server/tokenize.go
  • 89d91dc Update server/tokenize.go
  • 5a9e6a6 Merge branch 'main' into tokenize-pr
  • d939570 Merge pull request #1 from KunjShah95/tokenize-pr
  • 19b96ee Update api/types.go

📊 Changes

5 files changed (+261 additions, -3 deletions)

View changed files

📝 api/types.go (+41 -0)
📝 app/ui/app/src/components/Chat.tsx (+56 -3)
📝 app/ui/app/src/components/Message.tsx (+36 -0)
📝 server/routes.go (+1 -0)
➕ server/tokenize.go (+127 -0)

📄 Description

Adds token calculation support for issue #15639 by introducing a new /api/tokenize endpoint and displaying token usage in the chat UI.

Backend: Tokenization API

  • Adds request/response types for tokenization.
  • Implements POST /api/tokenize to return token counts for either a raw prompt or a list of chat messages, using the selected model’s tokenizer for accurate counts.
  • Gracefully handles models that can’t be tokenized locally (e.g., cloud/thinking models).
Frontend: Token Usage Display

  • Shows per-message token stats via a TokenStats component.
  • Shows session totals in the chat footer via a SessionTokenStats component.
Example
curl -X POST http://localhost:11434/api/tokenize \
  -d '{"model":"llama3.2","messages":[{"role":"user","content":"Hello"}]}'

Response (example)
{"tokens":25,"input_tokens":25,"output_tokens":6}


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-29 16:55:52 -05:00

Reference: github-starred/ollama#61962