[PR #15154] fix: memory leak when vocab loading from file failed #20323

Open
opened 2026-04-16 07:33:14 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/15154
Author: @EugeneKirian
Created: 3/30/2026
Status: 🔄 Open

Base: main ← Head: memory-leak-fix


📝 Commits (1)

  • d7c8d60 Fix memory leak when an exception is thrown during loading llama vocab from file.

📊 Changes

1 file changed (+1 addition, -0 deletions)


📝 llama/sampling_ext.cpp (+1 -0)

📄 Description

Summary

A memory leak occurs when an exception is thrown while loading a llama vocab from file: the vocab object allocated before the load is never freed.

Fix

Delete the created object in the catch block before returning.
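The pattern of the fix can be sketched as follows. This is a hypothetical, self-contained illustration, not the actual code in `llama/sampling_ext.cpp`; the `vocab` type, `load_from_file` helper, and `load_vocab` wrapper are stand-ins invented for the example.

```cpp
#include <cstdio>
#include <stdexcept>

// Hypothetical stand-in for the vocab type built in sampling_ext.cpp.
struct vocab {
    // ... token tables, scores, etc.
};

// Toggle to simulate a load failure (illustration only).
static bool g_fail = false;

// Stand-in for the loading routine that may throw.
static void load_from_file(vocab * /*v*/, const char * /*path*/) {
    if (g_fail) {
        throw std::runtime_error("failed to load vocab");
    }
}

// Returns nullptr on failure. Without the `delete v` in the catch
// block, the partially constructed object would leak on every
// failed load.
vocab * load_vocab(const char * path) {
    vocab * v = new vocab();
    try {
        load_from_file(v, path);
    } catch (const std::exception & err) {
        fprintf(stderr, "error loading vocab: %s\n", err.what());
        delete v;  // the fix: free the object before bailing out
        return nullptr;
    }
    return v;
}
```

An alternative that avoids the manual `delete` entirely would be holding the object in a `std::unique_ptr` and releasing it only on success, but the one-line `delete` in the catch block is the minimal change to the existing code.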


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-16 07:33:14 -05:00

Reference: github-starred/ollama#20323