[GH-ISSUE #3182] Add "Stop" command #48472

Closed
opened 2026-04-28 08:34:00 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @haydonryan on GitHub (Mar 16, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3182

What are you trying to do?

Ollama is great! There is `ollama serve` / `ollama start`, but there is no stop. There's already a big (closed) issue on how to stop it from autostarting on reboot, and the answer is OS-dependent. If you can create the service with the ollama CLI, then you should be able to stop and disable the service with the CLI too. Personally, I don't want the service running all the time, only while I'm actually using it.

How should we solve this?

Add an `ollama stop` command.

What is the impact of not solving this?

People are going to keep opening issues asking how to stop the service.

Anything else?

Nope - thanks for a great program!

GiteaMirror added the feature request label 2026-04-28 08:34:00 -05:00
Author
Owner

@darrenangle commented on GitHub (Mar 16, 2024):

+1 please

Author
Owner

@wang-yiwei commented on GitHub (Mar 20, 2024):

For Linux users: if you don't want ollama to automatically allocate your memory (RAM or VRAM), you can use the `systemctl` command to manually turn the ollama service on and off.

sudo systemctl status ollama   # check the status of the service
sudo systemctl stop ollama     # stop the service
sudo systemctl start ollama    # start it again

Possible use scenario:

  • I use the `cody` extension with ollama as the backend provider; ollama loads LLM weights onto my hardware while I'm coding.
  • I'm programming in VS Code while also testing an LLM API that I've deployed on the same machine.
  • I don't need the code-completion service right now.

If you are in any of the situations above, or you simply don't want ollama running out of your control, just stop the systemd service.
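The same tool also covers the autostart-on-reboot behavior mentioned in the original request. A sketch for systemd-based Linux installs, assuming the standard `ollama.service` unit created by the Linux installer:

```shell
# Stop the running service and keep it from starting on the next boot
sudo systemctl stop ollama
sudo systemctl disable ollama

# Later, re-enable autostart and bring the service back up
sudo systemctl enable ollama
sudo systemctl start ollama

# Check whether the unit is enabled and/or currently running
systemctl is-enabled ollama
systemctl is-active ollama
```

`disable` only removes the boot-time symlink; it does not stop an already-running service, which is why `stop` and `disable` are paired above.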
Author
Owner

@FilkerZero commented on GitHub (Mar 23, 2024):

I just added a similar request; I missed this the first time I scanned through open issues.

Author
Owner

@oldgithubman commented on GitHub (Mar 30, 2024):

Seems strange this wasn't included in the first place...

Author
Owner

@lonngxiang commented on GitHub (Apr 12, 2024):

+1

Author
Owner

@pdevine commented on GitHub (May 18, 2024):

Thanks for the issue @haydonryan. I'm going to close this in favor of #3314 since it's slightly more comprehensive.


Reference: github-starred/ollama#48472