[PR #8835] [MERGED] llama: use dynamic backend loading for mllama and clip #12790

Closed
opened 2026-04-13 00:09:45 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/8835
Author: @jmorganca
Created: 2/5/2025
Status: Merged
Merged: 2/5/2025
Merged by: @jmorganca

Base: main ← Head: jmorganca/clip-dynamic


📝 Commits (1)

  • f78716e llama: use dynamic backend loading for mllama and clip

📊 Changes

3 files changed (+36 additions, -82 deletions)


📝 llama/llama.cpp/examples/llava/clip.cpp (+8 -28)
📝 llama/mllama.cpp (+8 -23)
📝 llama/patches/0013-use-dynamic-backend-loading-for-clip.patch (+20 -31)

📄 Description

No description provided


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-13 00:09:45 -05:00

Reference: github-starred/ollama#12790