[GH-ISSUE #690] Stop Ollama #315

Closed
opened 2026-04-12 09:51:54 -05:00 by GiteaMirror · 62 comments
Owner

Originally created by @mora-phi on GitHub (Oct 3, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/690

Hi,
How can I stop Ollama ?
If I run with "ollama run llama2" for instance and then quit with "Ctrl-C", then go to http://127.0.0.1:11434/ in a browser, it shows "Ollama is running"
When I kill the running process with a kill -9, a new process is instantly spawned.
Therefore I don't know how to totally stop Ollama...
(I'm on macos)


@BruceMacD commented on GitHub (Oct 3, 2023):

Are you using the Ollama Mac app? If so, just exiting the toolbar app will stop the server. The Mac app will also restart the server if left open.

Here:

![image](https://github.com/jmorganca/ollama/assets/5853428/8eaaffa4-114c-4bea-98a6-05e72367c13d)

Otherwise, in a terminal:

```
$ pgrep ollama
74877
$ kill 74877
```

or to stop the Mac App:

```
osascript -e 'tell app "Ollama" to quit'
```
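The pgrep-then-kill two-step above can be wrapped in one helper that sends SIGTERM first and escalates to SIGKILL only if the process survives a grace period. This is just a sketch: the `stop_proc` name and the 5-second default grace period are my own choices, not part of Ollama.

```shell
#!/bin/sh
# stop_proc NAME [GRACE]: SIGTERM every process whose name is exactly NAME,
# then SIGKILL any that are still alive after GRACE seconds (default 5).
stop_proc() {
    name="$1"
    grace="${2:-5}"
    pids=$(pgrep -x "$name") || { echo "no $name process found"; return 0; }
    kill $pids 2>/dev/null          # polite SIGTERM first ($pids unquoted: one word per PID)
    i=0
    while [ "$i" -lt "$grace" ]; do
        pgrep -x "$name" >/dev/null || return 0   # all gone, done
        sleep 1
        i=$((i + 1))
    done
    kill -9 $pids 2>/dev/null       # last resort
}

stop_proc ollama
```

Note this only stops the current process; as discussed below, the Mac app (or a systemd unit on Linux) may respawn the server unless it is quit or disabled too.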

@mora-phi commented on GitHub (Oct 4, 2023):

Thanks a lot, I didn't check in the upper toolbar, my bad.
Indeed, closing properly from there stopped spawning new processes.
Thanks again :-)


@dongyuwei commented on GitHub (Oct 16, 2023):

The bug is not fixed; I can't kill it on Mac.
ollama version 0.1.1


@BruceMacD commented on GitHub (Oct 16, 2023):

@dongyuwei have you exited the Mac app from the toolbar?


@dongyuwei commented on GitHub (Oct 17, 2023):

Yes, @BruceMacD.
I logged out of my system and logged back in; no more ollama.


@jwandekoken commented on GitHub (Oct 18, 2023):

This is happening on Linux too. After I issue the command `ollama run model` and then close the terminal with Ctrl+D, the ollama instance keeps running. If I kill it, it just respawns.

Edit: in my case, even after restarting the system, the program keeps re-opening


@BruceMacD commented on GitHub (Oct 18, 2023):

@jwandekoken On Linux, Ollama runs as a systemd service. You can stop it using `systemctl`:

```
$ systemctl stop ollama.service
```

@devhims commented on GitHub (Nov 2, 2023):

> Are you using the Ollama Mac app? If so just exiting the toolbar app will stop the server. The Mac app will restart the server also, if left open.
>
> Here: ![image](https://user-images.githubusercontent.com/5853428/272356926-8eaaffa4-114c-4bea-98a6-05e72367c13d.png)
>
> Otherwise, in a terminal:
>
> ```
> $ pgrep ollama
> 74877
> $ kill 74877
> ```

Thanks! This worked on macOS.


@JermellB commented on GitHub (Nov 4, 2023):

`sudo service ollama stop` worked for me on Ubuntu.


@devwaseem commented on GitHub (Nov 26, 2023):

For Mac:
`pkill ollama` works.


@airtonix commented on GitHub (Dec 12, 2023):

> systemctl stop ollama.service

Should this not be made obvious by an abstraction?

```
ollama server start --system
# prompts for sudo to create the systemd unit if it doesn't exist, and starts it

ollama server stop --system
# again, sudo required

ollama server start
# no sudo required
# user presses Ctrl+D
# server stops
```

@shayneoneill commented on GitHub (Jan 11, 2024):

This really is not great behavior. Thanks to the notch on the M1/M2 laptops, the llama icon gets obscured, so it's not obvious at all that it's running, nor how to access that toolbar item. Is there some sort of config option to just forbid the system from putting things there and use command-line start/stop instead? Can I just remove the service entirely?

`pkill ollama` does NOT solve the problem, btw, as it somewhat disobediently just restarts it.

Also, I have *no* idea what systemctl is, nor does my Mac :(


@tmceld commented on GitHub (Jan 12, 2024):

Also having this problem on Mac. It's definitely running, but I don't have anything in the taskbar, and pgrep/kill just causes a restart.


@marhar commented on GitHub (Feb 7, 2024):

I've been having similar problems on my Mac... the server is running, I don't see anything on my taskbar to stop it, and I can't install the new Ollama since the process is running. pkill/pgrep etc., as per other users: killing the process just restarts it.

I worked around it with `rm -rf /Applications/Ollama; pkill ollama`, which caused an error. Following that, the ollama process was not running, and I could install the new Ollama.app into /Applications.


@MehrCurry commented on GitHub (Feb 15, 2024):

> rm -rf /Applications/Ollama; pkill ollama

It has to be `rm -rf /Applications/Ollama.app; pkill ollama`


@AnirudhaGohokar commented on GitHub (Feb 16, 2024):

How do I stop it on Windows?


@keebOo commented on GitHub (Feb 19, 2024):

In the Mac terminal, I check whether there is an active service with:

`lsof -i :11434`

This verifies whether anything is running on Ollama's standard port.

Following suggestions from other users, I then also execute:

`pgrep ollama`

After this, if the `lsof` command shows any process, I use the `kill` command followed by the PID to terminate the service (`pgrep` eventually shows all the ollama processes).
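The lsof-then-kill steps described above can be combined into one helper. A sketch only: `kill_port` is a hypothetical name, not an Ollama or system command, and it sends plain SIGTERM to whatever is listening on the port.

```shell
#!/bin/sh
# kill_port PORT: terminate whatever process is listening on the given TCP port.
# lsof -t prints bare PIDs; -sTCP:LISTEN restricts the match to listening sockets.
kill_port() {
    port="$1"
    pids=$(lsof -t -iTCP:"$port" -sTCP:LISTEN) || {
        echo "nothing listening on port $port"
        return 0
    }
    kill $pids    # $pids unquoted on purpose: one argument per PID
}

kill_port 11434   # 11434 is Ollama's default port
```

As others note in this thread, if the menu-bar app (or a systemd unit) is still managing the server, it may simply be respawned after being killed.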


@HazemElAgaty commented on GitHub (Mar 3, 2024):

I was only able to kill the process using the activity monitor.


@skytodmoon commented on GitHub (Mar 5, 2024):

I use `systemctl stop ollama.service` to stop ollama.
But when I use `systemctl start ollama.service`, the default port is bound again.
How do I really stop the service?


@techkanna commented on GitHub (Mar 14, 2024):

How do I stop it on Windows??


@sohaibsoussi commented on GitHub (Mar 14, 2024):

> How do i stop on windows??

Use this command in your PowerShell:

```
Get-Process | Where-Object {$_.ProcessName -like '*ollama*'} | Stop-Process
```


@hpsaturn commented on GitHub (Apr 7, 2024):

On my Debian I can reproduce it. Nothing works: `kill`, `service stop`, or `systemctl`; it fails trying to stop the service, and it also produces conflicts with my `amdgpu` driver: #3527


@tamashalasi commented on GitHub (Apr 14, 2024):

Not a good practice, but `pkill -9 ollama` works for now on Arch Linux (probably other distros as well). I haven't experienced any file corruption so far, but be careful with it.


@darkBuddha commented on GitHub (Apr 17, 2024):

same on Debian 12


@hpsaturn commented on GitHub (Apr 19, 2024):

re-open the issue? maybe yes..


@gros-chat commented on GitHub (Apr 20, 2024):

> > rm -rf /Applications/Ollama; pkill ollama
>
> It has to be `rm -rf /Applications/Ollama.app; pkill ollama`

It worked for me (Mac).


@lselector commented on GitHub (Apr 20, 2024):

This works on Mac: `sudo pkill -9 ollama Ollama`

Then double-click on Ollama in Applications to start it.


@lzhhh93 commented on GitHub (May 1, 2024):

> > How do i stop on windows??
>
> Use this command in your PowerShell: `Get-Process | Where-Object {$_.ProcessName -like '*ollama*'} | Stop-Process`

On my side, this command seems to only kill the ollama server but does not release RAM. RAM needs to be released manually in task manager.


@alby13 commented on GitHub (May 2, 2024):

> @jwandekoken On Linux, Ollama runs as a systemd service. You can stop it using `systemctl`.
>
> ```
> $ systemctl stop ollama.service
> ```

We used systemctl, and we noticed that ollama was running in the background.

We ran this command to stop the process and disable auto-starting of the ollama server; we can restart it manually at any time with `sudo systemctl start ollama.service`.

However, we noticed that once we restarted ollama.service and then rebooted the machine, the process got added to the auto-start again.

So what we did was stop the process and then disable it. This prevents it from automatically starting when Linux boots. The commands are:

```
sudo systemctl stop ollama.service
sudo systemctl disable ollama.service
```

Thank you for the original information in your post.


@p1r473 commented on GitHub (May 3, 2024):

Add an `ExecStop=` to your systemd unit file, as there isn't one in the [documentation](https://github.com/ollama/ollama/blob/main/docs/linux.md):

```
[Service]
ExecStop=/bin/kill -TERM $MAINPID
```

I've also gone ahead and added another customization:

```
[Service]
Environment="OLLAMA_MODELS=/home/pi/ollama-models/"
```

@joliss commented on GitHub (May 26, 2024):

On Mac, this problem seems to be fixed as of a few releases ago (currently on 0.1.38). 👍 Quitting the Ollama app in the menu bar, or alternatively running `killall Ollama ollama`, reliably kills the Ollama process now, and it doesn't respawn.


@Eliyahou commented on GitHub (May 27, 2024):

On Windows the command is:

```
Get-Process | Where-Object {$_.ProcessName -like '*ollama*'} | kill
```


@godefroi commented on GitHub (Jun 6, 2024):

> RAM needs to be released manually in task manager.

@lzhhh93 what does this even mean? A process, when stopped, does not consume any memory, and "task manager" does not have any functionality for "manually" "releasing" memory.


@pawliczka commented on GitHub (Jun 10, 2024):

> > RAM needs to be released manually in task manager.
>
> @lzhhh93 what does this even mean? A process, when stopped, does not consume any memory, and "task manager" does not have any functionality for "manually" "releasing" memory.

I have the same problem. When you TerminateProcess ollama.exe on Windows, ollama_llama_server.exe is not terminated.


@Skarian commented on GitHub (Jun 13, 2024):

> > > RAM needs to be released manually in task manager.
> >
> > @lzhhh93 what does this even mean? A process, when stopped, does not consume any memory, and "task manager" does not have any functionality for "manually" "releasing" memory.
>
> I have the same problem. When you TerminateProcess ollama.exe on Windows ollama_llama_server.exe is not terminated.

I am having this exact same issue. I am able to end ollama.exe, but the runners stay running and use RAM seemingly perpetually. I'm using the CLI version of ollama on Windows.


@kleer001 commented on GitHub (Jul 15, 2024):

I too would appreciate the syntactic sugar of an `ollama server_stop` command.


@Vimiso commented on GitHub (Jul 23, 2024):

I found that on Mac `pkill ollama` doesn't work, because the main process actually starts with a capital.

`pkill Ollama` should do it.


@dkgaraujo commented on GitHub (Jul 29, 2024):

Fresh install on Ubuntu. Killing it does not work for me. In fact, even killing it by PID is hard, because it spawns a new process (with a new PID = old PID + 2) every second or so. I'm disappointed in this behaviour. I could understand wanting to create a persistent server experience, but this is not (to my knowledge) good practice, and neither is the absence of syntactic sugar to stop it properly. Relatedly, I don't know why this issue is closed.


@cedricferry commented on GitHub (Jul 29, 2024):

If you use Homebrew:

```
$ brew services stop ollama
```

@AledHe commented on GitHub (Aug 17, 2024):

> > @jwandekoken On Linux Ollama is running as a systemd service. You can stop it using `systemctl`.
> >
> > ```
> > $ systemctl stop ollama.service
> > ```
>
> We used systemctl and we noticed that ollama was running in the background.
>
> We ran this command to stop the process and disable the auto-starting of the ollama server, and we can restart it manually at any time. To start it manually, we use this command: `sudo systemctl start ollama.service`
>
> However, we noticed that once we restarted the ollama.service and then reboot the machine, the process gets added to the auto-start again.
>
> So what we did was we stop the process, and then disable it every time. This prevents it from automatically starting when Linux is started. The commands are:
>
> `sudo systemctl stop ollama.service`
> `sudo systemctl disable ollama.service`
>
> Thank you for the original information in your post.

Very useful method, for an auto script:

```bash
#!/bin/bash

check_ollama() {
    pgrep ollama > /dev/null
}

check_autostart() {
    systemctl is-enabled --quiet ollama.service
}

check_service_active() {
    if systemctl is-active --quiet ollama.service; then
        echo "Ollama service is active."
    else
        echo "Ollama service is inactive."
    fi
}

start_service() {
    if check_ollama; then
        echo "Ollama is already running."
    else
        echo "Starting Ollama service..."
        sudo systemctl start ollama.service
        if systemctl is-active --quiet ollama.service; then
            echo "Ollama service started successfully."
        else
            echo "Failed to start Ollama service."
        fi
    fi
}

disable_service() {
    echo "Disabling Ollama service..."
    sudo systemctl disable ollama.service
    if check_ollama; then
        echo "Stopping Ollama service..."
        sudo systemctl stop ollama.service
        echo "Ollama service stopped."
    else
        echo "Ollama is not running."
    fi
    echo "Ollama service disabled."
}

disable_autostart() {
    if check_autostart; then
        echo "Disabling Ollama service auto-start..."
        sudo systemctl disable ollama.service
        echo "Ollama service auto-start disabled."
    else
        echo "Ollama service auto-start is already disabled."
    fi
}

echo "Checking Ollama service status..."
check_service_active
echo

while true; do
    PS3="Select an option: "
    options=("Start Ollama service" "Disable Ollama service" "Disable Ollama auto-start" "Exit")
    select opt in "${options[@]}"
    do
        case $REPLY in
            1) start_service; break ;;
            2) disable_service; break ;;
            3) disable_autostart; break ;;
            4) echo "Exiting."; exit ;;
            *) echo "Invalid option. Please select 1-4." ;;
        esac
    done
done
```

Please run the script with sudo. :))


@flashlan commented on GitHub (Aug 19, 2024):

What works for me on Linux is:

```bash
killall pt_main_thread
```

@MrBns commented on GitHub (Aug 20, 2024):

Why so much drama to stop Ollama? Why not just `ollama stop model/name`, and have that stop it?


@gonzalezea commented on GitHub (Aug 27, 2024):

> How do i stop on windows

This works for me in W10:

`taskkill /fi "imagename eq ollama app.exe"`


@Mustafanaji0413 commented on GitHub (Sep 4, 2024):

> This works on Mac: `sudo pkill -9 ollama Ollama`
>
> Then double-click on Ollama in Applications to start it

This works for Mac! THANKS!!


@gvirus21 commented on GitHub (Sep 10, 2024):

Faced this problem for 30 minutes; solved it by just force-killing Ollama from Mac's Activity Monitor.


@Ranjithdss15 commented on GitHub (Oct 21, 2024):

For Mac:

Close the ollama process from the Activity Monitor, and avoid confirming it again with `ollama ps` from your terminal, as this will restart the process.


@kiran-bsv commented on GitHub (Nov 14, 2024):

If ollama is managed by systemd, you can stop and disable it with these commands:

```
sudo systemctl stop ollama
sudo systemctl disable ollama
```

This worked for me


@alaeddingurel commented on GitHub (Dec 3, 2024):

> If ollama is managed by systemd, you can stop and disable it with these commands:
>
> ```
> sudo systemctl stop ollama
> sudo systemctl disable ollama
> ```
>
> This worked for me

Thank you! This worked for me too!


@tranvanthai commented on GitHub (Dec 26, 2024):

It worked for me:
`sudo pkill -9 ollama Ollama`


@webdev23 commented on GitHub (Jan 9, 2025):

Without all the mess, simply closing the HTTP connection client-side is what you are looking for.


@shayneoneill commented on GitHub (Jan 28, 2025):

Has there been any movement on this bug? It's been months now, and my current solution of uninstalling it or rebooting the computer is incredibly onerous.

Does anyone know how to stop it on a mac without it restarting? And why in the green hells IS it restarting? That seems awfully like a very ill-conceived deliberate design decision.

<!-- gh-comment-id:2619538009 -->

@godefroi commented on GitHub (Jan 28, 2025):

@shayneoneill it does indeed seem to be a deliberate design decision, and not one that the development team seems interested in changing.

<!-- gh-comment-id:2619612694 -->

@chenshaoju commented on GitHub (Jan 29, 2025):

Why can't we elegantly stop ollama? For example: ollama.exe stopserve

<!-- gh-comment-id:2621000131 -->

@tobsecret commented on GitHub (Jan 30, 2025):

@shayneoneill you can go to Activity Monitor, sort by name, and select and close all processes with ollama in the name.
There is an ollama helper process which, I would guess, restarts the main ollama process if it crashes, but it seems it might also restart it if you deliberately kill the main process.

<!-- gh-comment-id:2625324369 -->

@webdev23 commented on GitHub (Jan 31, 2025):

On Linux, without sudo, either wait 5 minutes for the VRAM to be freed, or:

curl -s http://localhost:11434/api/generate -d '{"model": "'$(ollama ps | tail -n1 | cut -d ' ' -f1 )'", "keep_alive": 0}'

You should get:
.... "done_reason":"unload"
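
If more than one model is loaded, the `tail -n1` variant only unloads the last one. Here is a hedged sketch that walks every data row of `ollama ps` instead, assuming the default table layout: one header line, then the model name in the first column of each row.

```shell
# Unload every loaded model by asking the server to keep each alive for 0 seconds.
# Column 1 of each row after the header is assumed to be the model name.
ollama ps | awk 'NR > 1 { print $1 }' | while read -r model; do
  curl -s http://localhost:11434/api/generate \
    -d "{\"model\": \"$model\", \"keep_alive\": 0}"
done
```

Note this only unloads models from VRAM; the server itself keeps running.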

<!-- gh-comment-id:2627702710 -->

@lamberta commented on GitHub (Feb 7, 2025):

> Does anyone know how to stop it on a mac without it restarting?

I just hit this when running the ollama-darwin.tgz binary (with no menu icon) ...
The following worked for me on macOS (from the comment above): sudo pkill -9 ollama Ollama

<!-- gh-comment-id:2641771096 -->

@abdrhxyii commented on GitHub (Feb 7, 2025):

I'm on Windows. I managed to fix it by opening Task Manager with Ctrl + Shift + Esc; an ollama process was already running, so I right-clicked it and ended the task, then restarted ollama from PowerShell. Now it works.

<!-- gh-comment-id:2642458629 -->

@tyagi-py commented on GitHub (Mar 16, 2025):

After spending a few minutes struggling, I figured out that if you installed it through Homebrew you need to run brew services stop ollama to stop it; otherwise it'll respawn itself for no reason.

<!-- gh-comment-id:2727593354 -->

@ramonjd commented on GitHub (May 8, 2025):

> I figured out if you installed it through homebrew you need to run brew services stop ollama to stop it, else it'll respawn itself for no reason.

Thank you! This was driving me crazy.

<!-- gh-comment-id:2860961486 -->

@ultimatedirty commented on GitHub (Jul 20, 2025):

In my case I wanted to increase the context length through an environment variable, so I did this:

pkill ollama; OLLAMA_CONTEXT_LENGTH=32768 ollama serve

<!-- gh-comment-id:3094631751 -->

@austindd commented on GitHub (Aug 3, 2025):

Weirdly, running pkill ollama on macOS did kill the process, but it just automatically restarts in a new process. Clicking the toolbar icon and quitting from the dropdown menu worked as expected.

I think I understand the reasoning behind this behavior... It's convenient for other programs not to worry about how to restart Ollama when it fails, so auto-restart makes sense for that use case. But it's very unintuitive: most programmers would assume pkill ollama or kill -9 <pid> would just work.

I would suggest adding a command to the Ollama CLI tool to force it to tear itself down without restarting. Maybe ollama destroy or something like that?

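Until something like that exists, a stopgap is a small shell function combining the approaches that have worked in this thread: quit the menu-bar app first so its helper cannot respawn the server, then kill any stragglers. This is a sketch, not an official interface; the `stop_ollama` name is made up here.

```shell
# Hypothetical helper for macOS; stops the app and any leftover processes.
stop_ollama() {
  # Quit the menu-bar app first, so it stops supervising/restarting the server.
  if command -v osascript >/dev/null 2>&1; then
    osascript -e 'tell app "Ollama" to quit' 2>/dev/null || true
  fi
  # Give the app a moment to shut its children down, then kill anything left.
  sleep 1
  pkill -f '[oO]llama' 2>/dev/null || true
}
```

After running it, curl http://127.0.0.1:11434/ should fail to connect instead of reporting "Ollama is running".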
<!-- gh-comment-id:3147425445 -->

@manualsh commented on GitHub (Aug 23, 2025):

To exit from Ollama and stop it on Windows, you have these options:

  1. If you are running Ollama in a command prompt or terminal, use Ctrl + C to stop the running server or model.

  2. If Ollama is running as a background process or service, open Task Manager (Ctrl + Shift + Esc), find the processes named "ollama" or similar, and manually end those tasks to stop Ollama.

  3. In PowerShell, you can stop all Ollama processes with this command:

Get-Process | Where-Object {$_.ProcessName -like '*ollama*'} | Stop-Process

  4. When Ollama is running, there is usually an icon in the system tray. Right-clicking that icon gives you an option to "Exit Ollama" to stop it completely.

  5. If you are inside an Ollama model session in the command line, typing the command /bye will exit the model and Ollama session.

These methods should help you properly exit and stop Ollama on Windows, whether from the command line, system tray, or Task Manager.

<!-- gh-comment-id:3217452162 -->
Reference: github-starred/ollama#315