[PR #14866] [CLOSED] fix: properly propagate error from reserveWorstCaseGraph in allocModel #77177

Closed
opened 2026-05-05 09:51:55 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/14866
Author: @LincolnBurrows2017
Created: 3/15/2026
Status: Closed

Base: main ← Head: fix/allocmodel-error-handling


📝 Commits (1)

  • 83b1d93 fix: properly propagate error from reserveWorstCaseGraph in allocModel

📊 Changes

1 file changed (+1 addition, -1 deletion)


📝 runner/ollamarunner/runner.go (+1 -1)

📄 Description

Description

The allocModel() function in runner/ollamarunner/runner.go was silently discarding errors from reserveWorstCaseGraph(true) by returning nil instead of err.

The Bug

err = s.reserveWorstCaseGraph(true)
if err != nil {
    return nil  // BUG: should be return err
}

This silently masks failures when prompt graph reservation fails under memory pressure:

  1. allocModel reports success to its caller
  2. The model appears to load correctly
  3. But the worst-case prompt graph was never reserved
  4. Subsequent inference may fail in unexpected ways
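The failure mode above can be reproduced with a minimal sketch. The names below are stand-ins for illustration, not the actual runner code: the real `reserveWorstCaseGraph` is a method on the runner's server type, and its error value is not shown in this PR.

```go
package main

import (
	"errors"
	"fmt"
)

// Hypothetical stand-in for the real reservation step, simulating a
// failure under memory pressure.
func reserveWorstCaseGraph(prompt bool) error {
	return errors.New("insufficient memory for worst-case graph")
}

// allocModel mirrors the pre-fix control flow: the error is checked,
// but nil is returned, so the caller observes success.
func allocModel() error {
	err := reserveWorstCaseGraph(true)
	if err != nil {
		return nil // BUG: the failure is swallowed here
	}
	return nil
}

func main() {
	// The reservation failed, yet the caller sees no error.
	fmt.Println("allocModel returned:", allocModel()) // allocModel returned: <nil>
}
```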

The Fix

Changed return nil to return err on line 1234.
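With the one-line change applied, the same control flow propagates the failure. A minimal sketch using the same hypothetical stand-in names as above:

```go
package main

import (
	"errors"
	"fmt"
)

// Hypothetical stand-in for the real reservation step.
func reserveWorstCaseGraph(prompt bool) error {
	return errors.New("insufficient memory for worst-case graph")
}

// allocModel with the fix applied: return err instead of nil.
func allocModel() error {
	err := reserveWorstCaseGraph(true)
	if err != nil {
		return err // fix: propagate the failure to the caller
	}
	return nil
}

func main() {
	// The caller can now detect the failed reservation and react,
	// instead of proceeding with a partially loaded model.
	if err := allocModel(); err != nil {
		fmt.Println("load failed:", err)
	}
}
```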

Testing

This is a straightforward bug fix: the error was always computed, just never returned. The fix ensures the error is properly propagated to the caller.

Fixes #14839


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-05-05 09:51:55 -05:00

Reference: github-starred/ollama#77177