[PR #12992] ggml update to b7009 #12769

Open · opened 2025-11-12 17:06:24 -06:00 by GiteaMirror (Owner) · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/12992
Author: @dhiltgen
Created: 2025-11-07
Status: 🔄 Open

Base: main ← Head: ggml_bump


📝 Commits (4)

  • 65b8fcb ggml update to b7009
  • 56449fd fix bakllava regression
  • 2cde1df remove patch for embeddings
  • 9f17c22 embeddings: plumb through embeddings setting in old engine
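
For context on the last two commits: in llama.cpp the embeddings setting is a per-context flag, so dropping the downstream embeddings patch means the old engine has to pass that flag through itself. Below is a minimal sketch against llama.cpp's public C API of where the setting ends up; it is not ollama's actual Go-side wiring, and the model path is a placeholder.

```cpp
// Hypothetical sketch (not ollama's code): how an "embeddings" setting reaches
// llama.cpp at the C API level. "model.gguf" is a placeholder path.
#include "llama.h"

int main() {
    llama_backend_init();

    llama_model_params mparams = llama_model_default_params();
    llama_model * model = llama_model_load_from_file("model.gguf", mparams);
    if (!model) return 1;

    llama_context_params cparams = llama_context_default_params();
    cparams.embeddings = true;   // ask the context to produce embeddings
    llama_context * ctx = llama_init_from_model(model, cparams);
    if (!ctx) return 1;

    // The flag can also be toggled on an existing context, which is what a
    // runner reusing one context across requests would do.
    llama_set_embeddings(ctx, true);

    llama_free(ctx);
    llama_model_free(model);
    llama_backend_free();
    return 0;
}
```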

📊 Changes

251 files changed (+23395 additions, -17471 deletions)

View changed files

📝 Makefile.sync (+1 -1) — see the sketch after this list
📝 fs/ggml/ggml.go (+1 -0)
📝 llama/build-info.cpp (+1 -1)
📝 llama/llama.cpp/.rsync-filter (+3 -0)
📝 llama/llama.cpp/common/common.cpp (+33 -0)
📝 llama/llama.cpp/common/common.h (+15 -1)
📝 llama/llama.cpp/common/json-schema-to-grammar.cpp (+19 -3)
📝 llama/llama.cpp/include/llama.h (+7 -3)
📝 llama/llama.cpp/src/llama-arch.cpp (+108 -0)
📝 llama/llama.cpp/src/llama-arch.h (+11 -0)
📝 llama/llama.cpp/src/llama-batch.cpp (+63 -31)
📝 llama/llama.cpp/src/llama-batch.h (+12 -1)
📝 llama/llama.cpp/src/llama-chat.cpp (+32 -0)
📝 llama/llama.cpp/src/llama-chat.h (+1 -0)
📝 llama/llama.cpp/src/llama-context.cpp (+46 -17)
📝 llama/llama.cpp/src/llama-context.h (+5 -5)
📝 llama/llama.cpp/src/llama-cparams.h (+1 -0)
📝 llama/llama.cpp/src/llama-graph.cpp (+12 -7)
📝 llama/llama.cpp/src/llama-hparams.cpp (+11 -1)
📝 llama/llama.cpp/src/llama-hparams.h (+6 -0)

...and 80 more files
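
The one-line Makefile.sync change listed above is presumably just the vendored llama.cpp pin moving to b7009. A minimal sketch of what such a bump looks like, assuming the pin is a plain make variable named FETCH_HEAD; the real file may use a different variable name or pin a commit SHA rather than the release tag.

```makefile
# hypothetical: the single-line pin bump implied by Makefile.sync (+1 -1)
FETCH_HEAD=b7009
```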

📄 Description

solar-pro required adjustments to match the recent model definition refactoring upstream.

Cursory testing on Metal looks promising. Draft until tested on other platforms.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2025-11-12 17:06:24 -06:00
Reference: github-starred/ollama-ollama#12769