[GH-ISSUE #6774] Add Tokenizer functionality to API #4270

Open
opened 2026-04-12 15:12:07 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @Master-Pr0grammer on GitHub (Sep 12, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6774

Having access to the model's tokenizer is extremely useful for counting tokens and managing the context window. In a lot of cases it's essential to get an LLM implementation to work properly. The model already has the tokenizer loaded, and ollama's backend, llama.cpp, already has an interface for the tokenizer, so it shouldn't be that difficult to expose in the API.
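As a sketch of what this could look like from the client side: the `/api/tokenize` route and response shape below are hypothetical (ollama does not expose such an endpoint at the time of this issue — that's the point of the request), while the context-trimming helper is plain client-side logic that works with any token counts.

```python
import json
import urllib.request


def count_tokens(prompt: str, model: str = "llama3",
                 host: str = "http://localhost:11434") -> int:
    """Count tokens via a HYPOTHETICAL /api/tokenize endpoint.

    Assumes the endpoint accepts {"model": ..., "prompt": ...} and
    returns {"tokens": [...]}; this is an illustration of the requested
    feature, not an existing ollama API.
    """
    req = urllib.request.Request(
        f"{host}/api/tokenize",
        data=json.dumps({"model": model, "prompt": prompt}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return len(json.load(resp)["tokens"])


def trim_to_context(message_token_counts: list[int],
                    context_window: int) -> list[int]:
    """Drop the oldest messages until the total fits the context window.

    Pure client-side bookkeeping: the caller supplies per-message token
    counts (e.g. from count_tokens above) and the model's window size.
    """
    kept = list(message_token_counts)
    total = sum(kept)
    while kept and total > context_window:
        total -= kept.pop(0)  # evict the oldest message first
    return kept
```

With a real token count per message, the helper makes the "managing the context window" use case concrete: `trim_to_context([100, 200, 300], 550)` keeps only the newest messages that fit.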

Unless this is already available in the API, in which case I'm sorry, but I just didn't see it in the documentation.

GiteaMirror added the feature request, api labels 2026-04-12 15:12:07 -05:00
Author
Owner

@rick-github commented on GitHub (Sep 13, 2024):

https://github.com/ollama/ollama/pull/6586


Reference: github-starred/ollama#4270