[GH-ISSUE #6078] Refactor num_parallel tracking in scheduler #65836

Open
opened 2026-05-03 22:51:15 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @dhiltgen on GitHub (Jul 30, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6078

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

The current scheduler code passes a pointer to `numParallel` and mutates it while determining the optimal parallel setting, which makes the data flow hard to follow.
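A minimal sketch of the pattern being criticized (the function name and memory-fit logic here are illustrative, not the actual ollama scheduler code): a helper receives a pointer to `numParallel` and adjusts it as a side effect, so the caller cannot tell from the call site where or why the value changes.

```go
package main

import "fmt"

// pickParallel is a hypothetical stand-in for the fit helpers: it mutates
// numParallel through a pointer, reducing parallelism until the estimated
// per-request memory fits in the available budget.
func pickParallel(available, perRequest uint64, numParallel *int) {
	for *numParallel > 1 && uint64(*numParallel)*perRequest > available {
		*numParallel--
	}
}

func main() {
	numParallel := 4
	pickParallel(3000, 1000, &numParallel) // silently rewrites numParallel
	fmt.Println(numParallel)               // prints 3
}
```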

OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-05-03 22:51:15 -05:00
Author
Owner

@Icelain commented on GitHub (Aug 3, 2024):

A possibly more readable approach would be to pass `numParallel` into `pickBestFullFitByLibrary` and `pickBestPartialFitByLibrary` and have them return it back.

For example:

```go
g, numParallel := pickBestFullFitByLibrary(pending, ggml, gpus, numParallel)
```

and

```go
gpus, numParallel = pickBestPartialFitByLibrary(pending, ggml, gpus, numParallel)
```
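The suggested refactor can be sketched as follows. This is an assumption-laden illustration: the function name mirrors the issue's example, but the signature and fit logic are simplified stand-ins, not the real scheduler implementation. Passing `numParallel` by value and returning the chosen setting makes every change to it visible at the call site.

```go
package main

import "fmt"

// pickBestFullFitByLibrary (simplified, hypothetical body): takes
// numParallel by value and returns the adjusted setting instead of
// mutating a caller-owned variable through a pointer.
func pickBestFullFitByLibrary(available, perRequest uint64, numParallel int) int {
	for numParallel > 1 && uint64(numParallel)*perRequest > available {
		numParallel--
	}
	return numParallel
}

func main() {
	numParallel := 4
	// The assignment makes the data flow explicit: numParallel only
	// changes where we can see it being reassigned.
	numParallel = pickBestFullFitByLibrary(3000, 1000, numParallel)
	fmt.Println(numParallel) // prints 3
}
```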

Reference: github-starred/ollama#65836