[GH-ISSUE #1246] Status endpoint needed #26396

Open
opened 2026-04-22 02:39:49 -05:00 by GiteaMirror · 3 comments

Originally created by @ex3ndr on GitHub (Nov 22, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1246

Hello!

I found a non-urgent issue in the API that makes the UX much worse when working with models from the web or with remote servers, because we can't see the current state of Ollama: is it downloading a model? Did it fail to download a model? Is it doing inference? How much RAM/VRAM is used? Also, without such a status endpoint it is not clear what to do if the connection was aborted during a pull - how do we check the status of the pull operation?

The lack of this endpoint results in awkward UI in most projects I have seen so far.
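
For illustration only, here is a rough sketch of how a web UI or remote client could poll a status endpoint like the one requested. The path `/api/status` and the response fields (`state`, `pull_progress`, `memory`) are purely hypothetical; they are not part of the current Ollama API.

```python
# Hypothetical example only: Ollama does not currently expose /api/status.
# This sketches how a client could poll server state (download progress,
# inference activity, RAM/VRAM usage) if such an endpoint existed.
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # default Ollama address

def get_status():
    # Endpoint path and response fields below are assumptions for
    # illustration, not part of the real API.
    with urllib.request.urlopen(f"{OLLAMA_HOST}/api/status", timeout=5) as resp:
        return json.load(resp)

status = get_status()
print(status.get("state"))          # hypothetical: "idle" | "pulling" | "generating"
print(status.get("pull_progress"))  # hypothetical: completed/total bytes for an in-flight pull
print(status.get("memory"))         # hypothetical: RAM/VRAM usage in bytes
```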

GiteaMirror added the feature request, api labels 2026-04-22 02:39:49 -05:00

@paulrobello commented on GitHub (Jul 12, 2024):

I agree, at the very least an endpoint that provides the same info as:
ollama ps

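
For context, the server does expose a GET /api/ps endpoint ("List Running Models") that reports roughly the same information as `ollama ps`, including per-model VRAM usage. A minimal sketch of querying it, assuming the server is reachable at the default address:

```python
# Minimal sketch: query Ollama's GET /api/ps endpoint, which reports the
# models currently loaded into memory (roughly what `ollama ps` prints).
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # default Ollama address

with urllib.request.urlopen(f"{OLLAMA_HOST}/api/ps", timeout=5) as resp:
    data = json.load(resp)

for m in data.get("models", []):
    # size_vram is the portion of the model held in GPU memory (bytes).
    print(m["name"], m.get("size"), m.get("size_vram"), m.get("expires_at"))
```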

@dhiltgen commented on GitHub (Oct 23, 2024):

A few of these are covered by #7262


@MadDenker commented on GitHub (Feb 16, 2026):

Agree with this. The only way I know the local model is working is by checking Task Manager or htop. I like the simple, clean UI, but some simple status information on the status bar would be very helpful.
