[PR #2459] [MERGED] Always add token to cache_tokens #42143

Closed
opened 2026-04-24 21:55:54 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/2459
Author: @jmorganca
Created: 2/12/2024
Status: Merged
Merged: 2/12/2024
Merged by: @jmorganca

Base: main ← Head: jmorganca/cachefix


📝 Commits (1)

  • c6a8b48 patch: always add token to cache_tokens

📊 Changes

1 file changed (+16 additions, -25 deletions)

View changed files

📝 llm/patches/01-cache.diff (+16 -25)

📄 Description

The diff is a bit hard to read, but this is the actual fix for our 01 patch, addressing failures caused by the kv cache being full

I believe this fixes #2339 and #1458


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-24 21:55:54 -05:00

Reference: github-starred/ollama#42143