[GH-ISSUE #4186] Tokenize and Detokenize API For Token Count #64643

Closed
opened 2026-05-03 18:25:14 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @sslx on GitHub (May 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4186

For RAG purposes, I'd love to find out the token count for text before feeding it to a model for a response.
Could you expose API endpoints for tokenize and detokenize on llama.cpp?
Thanks!
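For context, llama.cpp's bundled HTTP server exposes `POST /tokenize` and `POST /detokenize` endpoints, so a client-side token count can be obtained without a new Ollama API. The sketch below is illustrative only: the base URL and port are assumptions, and the response shape (`{"tokens": [...]}`) is what llama.cpp's server returns, not an Ollama API.

```python
import json
import urllib.request

# Assumed address of a running llama.cpp server (llama-server);
# adjust to wherever your instance actually listens.
BASE_URL = "http://localhost:8080"

def build_tokenize_payload(text: str) -> bytes:
    """Build the JSON request body expected by the /tokenize endpoint."""
    return json.dumps({"content": text}).encode("utf-8")

def count_tokens(text: str, base_url: str = BASE_URL) -> int:
    """Return the number of tokens the server's tokenizer produces for `text`."""
    req = urllib.request.Request(
        f"{base_url}/tokenize",
        data=build_tokenize_payload(text),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        tokens = json.loads(resp.read())["tokens"]
    return len(tokens)
```

A `/detokenize` call works the same way in reverse, POSTing `{"tokens": [...]}` and reading back `{"content": "..."}`.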

GiteaMirror added the feature request label 2026-05-03 18:25:14 -05:00
Author
Owner

@jmorganca commented on GitHub (Jun 4, 2024):

Hi there, thanks for the issue! Merging with https://github.com/ollama/ollama/issues/3021

<!-- gh-comment-id:2148528744 -->
Author
Owner

@functorism commented on GitHub (Jul 5, 2024):

@jmorganca This request seems to reappear regularly. Could you share whether there's a reason it hasn't been addressed yet, and whether pull requests for it are welcome?

<!-- gh-comment-id:2211135313 -->

Reference: github-starred/ollama#64643