[PR #7456] [MERGED] update llama3.2 vision memory estimation #12429

Closed · opened 2026-04-12 23:58:56 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/7456
Author: @mxyng
Created: 10/31/2024
Status: Merged
Merged: 11/4/2024
Merged by: @mxyng

Base: main ← Head: mxyng/llama3.2-vision-mem


📝 Commits (2)

8c238e7 mllama cross attention
d07cf41 refactor kv estimation

📊 Changes

2 files changed (+44 additions, -8 deletions)


📝 llm/ggml.go (+40 -1)
📝 llm/memory.go (+4 -7)

📄 Description

Adjust estimations for mllama, which has conditional graph components and a different cache shape.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-12 23:58:56 -05:00
Reference: github-starred/ollama#12429