[PR #10695] [MERGED] llama: fix defrag patch #39202

Closed
opened 2026-04-22 23:51:05 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/10695
Author: @jmorganca
Created: 5/13/2025
Status: Merged
Merged: 5/13/2025
Merged by: @jmorganca

Base: main ← Head: jmorganca/fix-kv-cache-patch


📝 Commits (1)

939ccf1 llama: fix defrag patch

📊 Changes

2 files changed (+50 additions, -7 deletions)


📝 llama/llama.cpp/src/llama-context.cpp (+12 -6)
📝 llama/patches/0010-ensure-KV-cache-is-fully-defragmented.patch (+38 -1)

📄 Description

When this patch was updated, the logic that actually exercises the defrag function was not ported correctly.
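The patch being fixed is titled "ensure KV cache is fully defragmented", which implies repeating defrag passes until no fragmentation remains rather than running a single bounded pass. Below is a minimal, self-contained sketch of that idea, assuming a simplified model: the cell layout, `defrag_pass`, and its move budget are hypothetical illustrations, not llama.cpp's actual internals.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Simplified KV-cache model: each slot holds a sequence id, -1 means free.
// One pass moves occupied slots toward the front, up to max_moves per call.
// Returns true if fragmentation may remain (so the caller should run again).
static bool defrag_pass(std::vector<int> &cells, int max_moves) {
    int moves = 0;
    for (std::size_t i = 0; i + 1 < cells.size(); ++i) {
        if (cells[i] != -1) {
            continue;
        }
        // find the next occupied slot after the gap and move it down
        for (std::size_t j = i + 1; j < cells.size(); ++j) {
            if (cells[j] != -1) {
                std::swap(cells[i], cells[j]);
                if (++moves >= max_moves) {
                    return true; // budget exhausted, more work may remain
                }
                break;
            }
        }
    }
    // fully defragmented iff no occupied slot appears after a free one
    bool saw_free = false;
    for (int c : cells) {
        if (c == -1) {
            saw_free = true;
        } else if (saw_free) {
            return true;
        }
    }
    return false;
}
```

The bug class the PR describes is in the caller: invoking `defrag_pass` once and stopping, instead of looping `while (defrag_pass(cells, budget)) {}` until the pass reports the cache compact.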


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-22 23:51:05 -05:00

Reference: github-starred/ollama#39202