[PR #8161] Set n_ubatch parameter to same batch size as n_batch #59346

Open
opened 2026-04-29 14:16:56 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/8161
Author: @s-kostyaev
Created: 12/18/2024
Status: 🔄 Open

Base: main ← Head: fix-panic-on-batch-embed


📝 Commits (1)

  • 2c14720 Set n_ubatch parameter to same batch size as n_batch

📊 Changes

1 file changed (+1 addition, -0 deletions)


📝 llama/llama.go (+1 -0)

📄 Description

This change prevents a panic during batch embeddings calculation.

Relates to https://github.com/ollama/ollama/issues/3554

See also https://github.com/ggerganov/llama.cpp/issues/6263
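For context, a minimal sketch of what the one-line change does (the surrounding code and variable names are hypothetical; only the n_ubatch assignment corresponds to the actual diff in llama/llama.go). llama.cpp splits each logical batch of n_batch tokens into micro-batches of n_ubatch tokens, and for embedding (non-causal attention) models it asserts that a micro-batch can hold the entire batch, so a default n_ubatch smaller than n_batch aborts during batch embedding:

```go
// Hypothetical context-setup fragment inside ollama's llama package
// (cgo bindings to llama.cpp); only the last line reflects the PR.
params := C.llama_context_default_params()
params.n_ctx = C.uint(numCtx)
params.n_batch = C.uint(batchSize)
// Embedding models use non-causal attention, and llama.cpp requires
// n_ubatch >= n_tokens for non-causal attention. Pinning n_ubatch to
// the same value as n_batch guarantees a full batch of embedding
// inputs never exceeds the micro-batch size.
params.n_ubatch = C.uint(batchSize)
```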


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-29 14:16:56 -05:00

Reference: github-starred/ollama#59346