[GH-ISSUE #3314] "server stop" and "server status" commands #48549

Open
opened 2026-04-28 08:48:41 -05:00 by GiteaMirror · 17 comments

Originally created by @FilkerZero on GitHub (Mar 23, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3314

What are you trying to do?

I would like to have more control over the ollama server. As it stands, the ollama command line does not provide a convenient way to stop the server once it is running or get the server status. Having a way to get memory, runtime, GPU and CPU statistics would also be a plus.

How should we solve this?

I suggest adding either new commands or flags to the serve command; some examples follow, but it's the functionality, not the particular syntax (option flags vs. commands) I care about:

  • ollama serve --status - Print server status (running/not running) and perhaps the loaded model and API URL
  • ollama serve --stop - Stop the server if it is running
  • ollama stop - Alias for ollama serve --stop
  • ollama unload - Unload the model from memory but leave the server running
  • ollama stats - Display server memory, runtime, and other statistics (eg, number of connected clients (max, current))
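
Some of the proposed status information is already reachable over Ollama's HTTP API, so a rough stand-in for `ollama serve --status` can be scripted today. A hedged sketch, assuming the default API address `http://localhost:11434` (adjust if `OLLAMA_HOST` is set); `/api/ps` reports models currently loaded in memory:

```shell
# Rough stand-in for "ollama serve --status" (sketch, default address assumed)
if curl -fsS --max-time 2 http://localhost:11434/ >/dev/null 2>&1; then
  echo "status: running"
  # list which models are currently loaded in memory
  curl -fsS http://localhost:11434/api/ps
else
  echo "status: not running"
fi
```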

What is the impact of not solving this?

It will remain hard for the average home user to control the ollama server process and determine the resources in use by it.

Anything else?

This appears to be a near duplicate of #3182

GiteaMirror added the feature request label 2026-04-28 08:48:41 -05:00

@OPDEV001 commented on GitHub (Mar 24, 2024):

Same needed, support!!!


@spencercjh commented on GitHub (Mar 27, 2024):

I'd like to resolve it. If you assign it to me, I will refer to helm dashboard (https://github.com/komodorio/helm-dashboard) to implement the feature.


@yale8848 commented on GitHub (Mar 27, 2024):

Hope this feature gets added.
Thank you.


@oldgithubman commented on GitHub (Mar 30, 2024):

A GUI would be even nicer


@lonngxiang commented on GitHub (Apr 12, 2024):

same


@cpoptic commented on GitHub (Jun 24, 2024):

Any update on this? 3 months this issue has been open. It's reasonable to expect a server to be able to be stopped without having to resort to pkill commands.

Being able to ollama serve --stop should be a top, top priority for this project.


@oldgithubman commented on GitHub (Jun 25, 2024):

> Any update on this? 3 months this issue has been open. It's reasonable to expect a server to be able to be stopped without having to resort to pkill commands.
>
> Being able to ollama serve --stop should be a top, top priority for this project.

I just switched to using llama.cpp directly (ollama is basically just a wrapper for llama.cpp). CTRL-C is all you need


@cpoptic commented on GitHub (Jun 25, 2024):

Oh thank you, @oldmanjk, I had been reading about llama.cpp and that sounds so much simpler for stopping a server than the current way using ollama. Thank you!


@Kroy22 commented on GitHub (Aug 19, 2024):

On my system (Win11), there's an icon in the system tray after I start the ollama server.
Right-clicking it gives you the option to exit Ollama.
Refreshing localhost (http://localhost:11434/) will verify the server has stopped.


@kitarp29 commented on GitHub (Sep 12, 2024):

@jmorganca I can work on these commands. I have been trying to run Ollama in a container. These commands could be really handy!


@pdevine commented on GitHub (Sep 18, 2024):

Just an update:

  • ollama serve --stop you can do with your OS, so no plans to add that
  • ollama unload is now implemented and will be in 0.3.11 as ollama stop (the antonym of ollama run)
  • ollama stats is an interesting idea
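
The 0.3.11 workflow described above can be sketched as follows; "llama3" is an illustrative model name, and the block guards on `ollama` being installed:

```shell
# Hedged sketch of the "ollama stop" (formerly proposed "ollama unload") flow
if command -v ollama >/dev/null 2>&1; then
  ollama ps            # list models currently loaded in memory
  ollama stop llama3   # unload "llama3"; the server itself keeps running
else
  echo "ollama not installed"
fi
```

Note that this unloads a model from memory; stopping the server process itself is still left to the OS, per the comment above.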

@jmitek commented on GitHub (Sep 18, 2024):

I'm now using ollama serve, then just ctrl+c when I don't need it. Does the job for me.


@kitarp29 commented on GitHub (Sep 18, 2024):

Well, when you run it in containers, ctrl+c is not always an option.
@pdevine I can work on the stats feature if you like.


@pdevine commented on GitHub (Sep 23, 2024):

@kitarp29 wouldn't you just use docker stop or docker kill?


@dhiltgen commented on GitHub (Oct 23, 2024):

Some aspects of "status" are covered by #7262


@qs-arno commented on GitHub (Nov 20, 2024):

> • ollama serve --stop you can do with your OS, so no plans to add that

I mean, obviously you're not technically wrong, but it's currently a pretty manual process unless you happen to have the terminal where it's running in front of you. Does Ollama maintain a pidfile anywhere? I couldn't find one, but maybe I'm missing something.

Anyway, I've got some suggestions for people who are running into this and would like some ideas of how to improve things (to clarify, I still think having an ollama shutdown-style subcommand is the correct/best way, but in the absence of that...).

Eyeball it

This is the simplest way (requires no setup); if you're not doing this often then honestly it's probably best to just get used to this and move on with your life. 🙂

```sh
## Search for your "ollama serve" process
ps -ef | grep "ollama"

## Once you've found the PID by hand, send SIGINT
kill -INT 8675309
```

You could probably just do killall -INT ollama or similar but personally I feel like that kind of loose cannon behaviour always bites back eventually.

Use a shell script wrapper to generate a pidfile, and use that to send SIGINT (Ctrl-c)

```sh
#!/bin/sh

## Use this command to stop Ollama if it was started with this script (this
## is the same as typing Ctrl-c in the terminal where Ollama is running):
## kill -INT $(cat "${HOME}/.ollama/ollama.pid") && rm "${HOME}/.ollama/ollama.pid"
##
## You can use this first, to confirm that PID is still your Ollama service:
## ps $(cat "${HOME}/.ollama/ollama.pid")

## Put your usual start command line here
ollama serve &

echo ${!} > "${HOME}/.ollama/ollama.pid"

## EOF
########
```

Using a pidfile is generally fine, but technically you've gotta watch out for edge cases like "the process in question stopped and started again, but the pidfile never updated, and now some other process is using that PID".
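
One way to guard against that stale-pidfile edge case is to test the PID before trusting it. A sketch, using the same pidfile path as the wrapper script above; note that `kill -0` only proves that *some* process has that PID, not that it is still Ollama, so cross-check with `ps` if in doubt:

```shell
# Check whether the PID recorded in the pidfile still points at a live process
PIDFILE="${HOME}/.ollama/ollama.pid"
if [ -f "$PIDFILE" ] && kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
  echo "pidfile check: process $(cat "$PIDFILE") is alive"
else
  echo "pidfile check: stale or missing pidfile"
fi
```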

Use a terminal multiplexer

What I would actually recommend is running Ollama inside screen, tmux, etc. That way you can reattach to the terminal it's using from anywhere and just hit Ctrl-c there, and not have to worry about tracking down PIDs or worrying about pidfile race conditions, etc.


@ocontant commented on GitHub (Apr 16, 2025):

I would refer you to read about how daemons have been managed in Unix for 40+ years.
Those tools were manually coded for each daemon, but they ended up being decoupled into their own commands over time:

  • https://www.commandlinux.com/man-page/man3/daemon.3.html
  • https://www.commandlinux.com/man-page/man3/pidfile.3.html
  • https://www.commandlinux.com/man-page/man3/flopen.3.html
  • https://www.commandlinux.com/man-page/man2/open.2.html

Reference: github-starred/ollama#48549