[PR #13972] [MERGED] Llama update powerpc sync #40333

Closed
opened 2026-04-23 01:15:21 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/13972
Author: @gabe-l-hart
Created: 1/29/2026
Status: Merged
Merged: 2/2/2026
Merged by: @jmorganca

Base: llama-update ← Head: llama-update-powerpc-sync


📝 Commits (2)

  • 035aee3 feat: Include ggml-cpu/arch/powerpc in ggml rsync
  • b765e03 feat: Sync ggml-cpu/arch/powerpc

📊 Changes

3 files changed (+2388 additions, -0 deletions)


📝 ml/backend/ggml/ggml/.rsync-filter (+1 -0)
ml/backend/ggml/ggml/src/ggml-cpu/arch/powerpc/cpu-feats.cpp (+82 -0)
ml/backend/ggml/ggml/src/ggml-cpu/arch/powerpc/quants.c (+2305 -0)

📄 Description


This PR extends the current vendor update PR (https://github.com/ollama/ollama/pull/13832) to include the additional backend ggml support for ggml-cpu/arch/powerpc that enables PowerPC support. This is one of the key pieces from https://github.com/ollama/ollama/pull/13543 needed to fully fix PowerPC support. I have not included the additional patches from that PR here, to avoid conflating the two sets of changes, but since the current bump PR is actively pulling over new llama.cpp vendored code, it is the logical place to also add this sync mechanism.
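For context on the sync mechanism: the `.rsync-filter` change (+1 line) adds an include rule so that the vendoring rsync picks up the `ggml-cpu/arch/powerpc` sources from upstream llama.cpp. The exact rule is not shown in this mirror, but a rule of this kind would look roughly like the following sketch (the pattern below is an illustrative assumption, not the actual line from the PR):

```text
# Hypothetical rsync filter fragment (merge-file syntax); the real
# .rsync-filter line in the PR may differ.
# "+" marks an include; "***" matches the directory and everything under it.
+ /src/ggml-cpu/arch/powerpc/***
```

Include rules like this must appear before any broader exclude rule that would otherwise skip the path, since rsync applies the first matching filter rule.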

Reference: https://github.com/ollama/ollama/pull/13832#issuecomment-3818298195


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.


Reference: github-starred/ollama#40333