[GH-ISSUE #12764] A way to quickly stop models via Ollama's system tray symbol in Windows #34225

Open
opened 2026-04-22 17:38:26 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @NeoTiger on GitHub (Oct 24, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12764

I often need to manually stop a running model when I need the full VRAM for another application, such as a game, or when switching to a different model (e.g. from a chat model to a Stable Diffusion model).

Right now I have to do that with a text command in the PowerShell CLI, but it's a bit awkward to type some models' longer names, especially when they were pulled from Hugging Face.

I'd love a feature in Ollama's system tray icon so that I can use its context menu to select and stop specific (or all) models from there.

GiteaMirror added the feature request label 2026-04-22 17:38:26 -05:00

@rick-github commented on GitHub (Oct 24, 2025):

Not exactly what you asked for, but you could create a PowerShell script that stops all models and add it to the taskbar. Then it's just a click to unload the models.


@rick-github commented on GitHub (Oct 24, 2025):

```powershell
# Resolve the Ollama endpoint, falling back to the default local address
$ollamaHost = $env:OLLAMA_HOST
if ([string]::IsNullOrWhiteSpace($ollamaHost)) {
  $ollamaHost = "http://localhost:11434"
}

# List the currently loaded models
$ps = Invoke-RestMethod -Uri "${ollamaHost}/api/ps" -Method Get -ErrorAction Stop

# A keep_alive of 0 tells Ollama to unload each model immediately
foreach ($model in $ps.models | ForEach-Object { $_.model }) {
  $requestBody = @{
    model      = $model
    keep_alive = 0
  } | ConvertTo-Json
  $null = Invoke-RestMethod -Uri "${ollamaHost}/api/generate" -Method Post -Body $requestBody -ErrorAction Stop
}
```
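For non-Windows setups, the same unload sequence can be expressed in Python. This is a minimal sketch, assuming only the documented `/api/ps` and `/api/generate` endpoints and the default `http://localhost:11434` address; the helper names are illustrative, not part of any Ollama client library.

```python
import json
import os
import urllib.request

# Respect OLLAMA_HOST if set, otherwise use the default local endpoint
OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://localhost:11434")


def unload_payload(model: str) -> bytes:
    # keep_alive = 0 asks Ollama to free the model's VRAM right away
    return json.dumps({"model": model, "keep_alive": 0}).encode()


def stop_all_models(host: str = OLLAMA_HOST) -> list:
    # List currently loaded models via /api/ps, then unload each one
    with urllib.request.urlopen(f"{host}/api/ps") as resp:
        models = [m["model"] for m in json.load(resp).get("models", [])]
    for name in models:
        req = urllib.request.Request(
            f"{host}/api/generate",
            data=unload_payload(name),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req).close()
    return models


if __name__ == "__main__":
    for name in stop_all_models():
        print(f"unloaded {name}")
```

Saved as a script, this can be wired to a desktop shortcut for the same one-click unload.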
Reference: github-starred/ollama#34225