[GH-ISSUE #5253] Add queue position indicator #29048

Open
opened 2026-04-22 07:41:32 -05:00 by GiteaMirror · 3 comments

Originally created by @uzumakinaruto19 on GitHub (Jun 24, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5253

Originally assigned to: @dhiltgen on GitHub.

Currently, when running resource-intensive models on Ollama, especially on less powerful hardware, it's not clear how long processing might take or if there's a queue of tasks.

Feature request:

  1. Implement a way to show the user's position in the processing queue (if any). This is my main concern.
  2. Add an option to display the estimated time until processing begins or completes.

This feature would be beneficial for:

  • Users running large models on consumer-grade hardware
  • Understanding and managing processing times
  • Improving user experience by providing more information about task status

Possible implementation ideas:

  • Add a new command like ollama status to show current queue position and estimates
  • Include this information in verbose output modes
  • Optionally display this info in the command-line interface during model runs
  • It would be great if this information were also available via an API call

Has this been considered before? It would greatly enhance the user experience when working with more demanding models or on systems with limited resources.

GiteaMirror added the feature request label 2026-04-22 07:41:32 -05:00

@DustyPullRequest commented on GitHub (Apr 21, 2025):

This would be so useful, especially for tools that parallelize by using different models for different tasks.


@sjsone commented on GitHub (Aug 2, 2025):

@dhiltgen would you mind if I try working on this? I have a similar use-case to @uzumakinaruto19 and I think that it would be a good "first issue" for me to work on.


@Thf772 commented on GitHub (Mar 14, 2026):

This could be similar or related to #2004, which is more about information on the whole queue.

Reference: github-starred/ollama#29048