[GH-ISSUE #4764] ollama stop [id of running model] #65037

Closed
opened 2026-05-03 19:35:55 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @mrdev023 on GitHub (Jun 1, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4764

Sometimes, when I use external tools with Ollama, some models continue to run after the application exits.

It would be useful to have a command like this:

```bash
ollama stop [id of running model]
```
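As a workaround until such a command exists, Ollama's HTTP API can unload a model: a generate request with `keep_alive: 0` asks the server to evict the model from memory. A minimal Python sketch, assuming the default server address `localhost:11434` and a hypothetical model name:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama server address (assumption)

def unload_payload(model: str) -> dict:
    """Build the request body that asks Ollama to unload a model.

    A generate request with no prompt and keep_alive=0 tells the
    server to evict the model from memory immediately.
    """
    return {"model": model, "keep_alive": 0}

def stop_model(model: str) -> None:
    """POST the unload request (requires a live Ollama server)."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(unload_payload(model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req).close()
```

Usage would be `stop_model("llama3")` (model name hypothetical); the call fails unless a server is actually listening.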
GiteaMirror added the feature request label 2026-05-03 19:35:55 -05:00
Author
Owner

@orctom commented on GitHub (Aug 22, 2024):

![image](https://github.com/user-attachments/assets/e5e321b1-ca51-4fa8-86cd-7e01fc389b58)

Really need this feature. Don't tell me to `egrep ollama` and then `kill -9`; that's silly, since there's no way to tell which PID belongs to the target model.
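Rather than grepping for PIDs, the server's `/api/ps` endpoint reports which models are currently loaded. A sketch of reading it from Python; the exact response shape shown in `sample` is an assumption based on Ollama's API documentation:

```python
import json
import urllib.request

def running_models(ps_response: dict) -> list[str]:
    """Extract the names of loaded models from an /api/ps response body."""
    return [m["name"] for m in ps_response.get("models", [])]

def fetch_ps(url: str = "http://localhost:11434/api/ps") -> dict:
    """Query a live Ollama server for its loaded models (requires a server)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Assumed, abbreviated response shape:
sample = {"models": [{"name": "llama3:latest", "size": 4661224676}]}
```

This identifies models by name instead of by OS process, which is the information `kill -9` cannot give you.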

Author
Owner

@HomunMage commented on GitHub (Sep 2, 2024):

We need an `ollama stop` that can kill the Ollama server without using systemctl,

because we need to invoke it programmatically, e.g. from a Python or C++ subprocess or thread.
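Until a built-in command exists, a parent process can own the server and terminate it directly, which avoids systemctl entirely. A sketch of that pattern; a sleeping Python child stands in for the real `["ollama", "serve"]` command:

```python
import subprocess
import sys

class ManagedServer:
    """Own a server subprocess so it can be stopped without systemctl."""

    def __init__(self, cmd: list[str]):
        self.cmd = cmd
        self.proc: subprocess.Popen | None = None

    def start(self) -> None:
        self.proc = subprocess.Popen(self.cmd)

    def stop(self, timeout: float = 5.0) -> None:
        """Ask the child to exit; escalate to a hard kill if it ignores us."""
        if self.proc is None:
            return
        self.proc.terminate()
        try:
            self.proc.wait(timeout=timeout)
        except subprocess.TimeoutExpired:
            self.proc.kill()
            self.proc.wait()

# Stand-in for ["ollama", "serve"]: a child that would sleep for a minute.
server = ManagedServer([sys.executable, "-c", "import time; time.sleep(60)"])
server.start()
server.stop()
```

The terminate-then-kill escalation gives the server a chance to shut down cleanly before resorting to the `kill -9` equivalent.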


Reference: github-starred/ollama#65037