[PR #3190] [CLOSED] update llama.cpp submodule to 12247f4 (release tag: b2440) #11093

Closed
opened 2026-04-12 23:21:01 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/3190
Author: @acanis
Created: 3/17/2024
Status: Closed

Base: main ← Head: main


📝 Commits (1)

  • bb1d537 update llama.cpp submodule to 12247f4 (release tag: b2440)

📊 Changes

4 files changed (+20 additions, -26 deletions)


📝 llm/llama.cpp (+1 -1)
📝 llm/patches/01-cache.diff (+6 -8)
📝 llm/patches/02-cudaleaks.diff (+6 -6)
📝 llm/patches/05-fix-clip-free.diff (+7 -11)

📄 Description

Adding support for Command-R model #3100


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-12 23:21:01 -05:00

Reference: github-starred/ollama#11093