Mirror of https://github.com/ollama/ollama.git (synced 2026-05-06 16:11:34 -05:00)
Closed · opened 2026-05-03 08:21:57 -05:00 by GiteaMirror · 63 comments
Originally created by @mora-phi on GitHub (Oct 3, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/690
Hi,
How can I stop Ollama?
If I run "ollama run llama2", for instance, then quit with Ctrl-C and go to http://127.0.0.1:11434/ in a browser, it shows "Ollama is running".
When I kill the running process with kill -9, a new process is instantly spawned.
Therefore I don't know how to completely stop Ollama...
(I'm on macOS)
@BruceMacD commented on GitHub (Oct 3, 2023):
Are you using the Ollama Mac app? If so, just exiting the menu bar app will stop the server. If left open, the Mac app will also restart the server.
Here: [screenshot not mirrored]
Otherwise, in a terminal:
or to stop the Mac App:
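The two code blocks in this comment were stripped by the mirror. Judging from commands quoted later in the thread, they were along these lines (an approximation, not BruceMacD's exact snippets):

```shell
#!/bin/sh
# "in a terminal": stop a server started with `ollama serve`
server_stop="killall ollama"
# "to stop the Mac App" (the app's process name is capitalized "Ollama"):
app_stop='osascript -e '\''tell application "Ollama" to quit'\'''
# Printed rather than executed here, so the sketch is safe to run anywhere.
echo "$server_stop"
echo "$app_stop"
```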
@mora-phi commented on GitHub (Oct 4, 2023):
Thanks a lot, I didn't check in the upper toolbar, my bad.
Indeed, closing properly from there stopped spawning new processes.
Thanks again :-)
@dongyuwei commented on GitHub (Oct 16, 2023):
The bug is not fixed; I can't kill it on Mac.
ollama version 0.1.1
@BruceMacD commented on GitHub (Oct 16, 2023):
@dongyuwei have you exited the mac app from the toolbar?
@dongyuwei commented on GitHub (Oct 17, 2023):
Yes. @BruceMacD
I logged out of my system and logged back in; no more ollama.
@jwandekoken commented on GitHub (Oct 18, 2023):
This is happening on Linux too. After I issue the command ollama run model and close the terminal with Ctrl + D, the ollama instance keeps running. If I kill it, it just respawns.
Edit: in my case, even after restarting the system, the program keeps re-opening.
@BruceMacD commented on GitHub (Oct 18, 2023):
@jwandekoken On Linux, Ollama runs as a systemd service. You can stop it using systemctl.
@devhims commented on GitHub (Nov 2, 2023):
Thanks! This worked on MacOS.
@JermellB commented on GitHub (Nov 4, 2023):
sudo service ollama stop worked for me on Ubuntu.
@devwaseem commented on GitHub (Nov 26, 2023):
For Mac:
pkill ollama works.
@airtonix commented on GitHub (Dec 12, 2023):
Should this not be made obvious by an abstraction?
@shayneoneill commented on GitHub (Jan 11, 2024):
This really is not great behavior. Thanks to the notch on the M1/M2 laptops, the llama icon gets obscured, so it's not obvious at all that it's running, nor how to access that toolbar item. Is there some sort of config option to forbid the system from putting things there and use command-line start/stop instead? Can I just remove the service entirely?
pkill ollama does NOT solve the problem, by the way, as it somewhat disobediently just restarts.
Also, I have no idea what systemctl is, and neither does my Mac :(
@tmceld commented on GitHub (Jan 12, 2024):
Also having this problem on Mac. It's definitely running, but I don't have anything in the menu bar, and pgrep/kill just causes a restart.
@marhar commented on GitHub (Feb 7, 2024):
I've been having similar problems on my Mac... the server is running, I don't see anything in my menu bar to stop it, and I can't install the new Ollama since the process is running. As per other users, pkill/pgrep etc. just restart the process when it is killed.
I worked around it by
rm -rf /Applications/Ollama; pkill ollama
which caused an error. Following that, the ollama process was not running, and I could install the new Ollama.app into /Applications.
@MehrCurry commented on GitHub (Feb 15, 2024):
It has to be
rm -rf /Applications/Ollama.app; pkill ollama
@AnirudhaGohokar commented on GitHub (Feb 16, 2024):
How do I stop it on Windows?
@keebOo commented on GitHub (Feb 19, 2024):
In the Mac terminal, I check whether there is an active service using the command:
lsof -i :11434
This verifies whether anything is running on ollama's standard port.
Following suggestions from other users, I then also execute:
pgrep ollama
After this, if the lsof command shows any process, I use the kill command followed by the PID to terminate the service (pgrep eventually shows all the ollama processes).
@HazemElAgaty commented on GitHub (Mar 3, 2024):
I was only able to kill the process using the activity monitor.
@skytodmoon commented on GitHub (Mar 5, 2024):
I use 'systemctl stop ollama.service' to stop ollama.
But when I use 'systemctl start ollama.service', the default port is bound again.
How do I really stop the service?
@techkanna commented on GitHub (Mar 14, 2024):
How do I stop it on Windows??
@sohaibsoussi commented on GitHub (Mar 14, 2024):
Use this command in your PowerShell:
Get-Process | Where-Object {$_.ProcessName -like '*ollama*'} | Stop-Process
@hpsaturn commented on GitHub (Apr 7, 2024):
On my Debian I can reproduce it. Nothing works:
kill, service stop, or systemctl all fail trying to stop the service, and it also produces conflicts with my amdgpu driver: #3527
@tamashalasi commented on GitHub (Apr 14, 2024):
Not a good practice, but
pkill -9 ollama
works for now on Arch Linux (probably other distros as well). Haven't experienced any file corruption so far, but be careful with it.
@darkBuddha commented on GitHub (Apr 17, 2024):
same on Debian 12
@hpsaturn commented on GitHub (Apr 19, 2024):
Re-open the issue? Maybe, yes...
@gros-chat commented on GitHub (Apr 20, 2024):
it worked for me (Mac)
@lselector commented on GitHub (Apr 20, 2024):
This works on Mac:
sudo pkill -9 ollama Ollama
Then double-click on Ollama in Applications to start it.
@lzhhh93 commented on GitHub (May 1, 2024):
On my side, this command seems to only kill the ollama server but does not release RAM; RAM needs to be released manually in Task Manager.
@alby13 commented on GitHub (May 2, 2024):
We used systemctl and noticed that ollama was running in the background.
We ran this command to stop the process and disable auto-starting of the ollama server; we can restart it manually at any time.
To start it manually, we use this command: sudo systemctl start ollama.service
However, we noticed that once we restarted ollama.service and then rebooted the machine, the process was added to the auto-start again.
So what we did was stop the process and then disable it each time. This prevents it from automatically starting when Linux boots. The commands are:
sudo systemctl stop ollama.service
sudo systemctl disable ollama.service
Thank you for the original information in your post.
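The stop-then-disable sequence above can be sketched as one small script (the DRY_RUN guard is added here only so the commands can be previewed before running them for real):

```shell
#!/bin/sh
# Stop the server now and keep systemd from starting it again on boot.
# With DRY_RUN=1 (the default here) the commands are printed, not run.
run() { if [ "${DRY_RUN:-1}" = "1" ]; then echo "$@"; else "$@"; fi; }

run sudo systemctl stop ollama.service
run sudo systemctl disable ollama.service
run systemctl is-enabled ollama.service   # should report "disabled" afterwards
```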
@p1r473 commented on GitHub (May 3, 2024):
Add an ExecStop to your systemd file, as there isn't one in the documentation.
I've also gone ahead and added another customization.
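A drop-in override along those lines might look like this (a sketch; create it with `sudo systemctl edit ollama`). The install script's unit sets `Restart=always`, which is what makes a manually killed process respawn:

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
# Let `systemctl stop` terminate the main process explicitly
ExecStop=/bin/kill -s TERM $MAINPID
# Do not respawn the server after a manual kill
Restart=no
```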
@joliss commented on GitHub (May 26, 2024):
On Mac, this problem seems to be fixed as of a few releases ago (currently on 0.1.38). 👍 Quitting the Ollama app in the menu bar, or alternatively running
killall Ollama ollama
reliably kills the Ollama process now, and it doesn't respawn.
@Eliyahou commented on GitHub (May 27, 2024):
in Windows the command is
Get-Process | Where-Object {$_.ProcessName -like 'ollama'} | kill
@godefroi commented on GitHub (Jun 6, 2024):
@lzhhh93 what does this even mean? A process, when stopped, does not consume any memory, and "task manager" does not have any functionality for "manually" "releasing" memory.
@pawliczka commented on GitHub (Jun 10, 2024):
I have the same problem. When you TerminateProcess ollama.exe on Windows, ollama_llama_server.exe is not terminated.
@Skarian commented on GitHub (Jun 13, 2024):
I am having this exact same issue. I am able to end ollama.exe, but the runners stay running and use RAM seemingly perpetually. I'm using the CLI version of ollama on Windows.
@kleer001 commented on GitHub (Jul 15, 2024):
I too would appreciate the syntactic sugar of an 'ollama server_stop' command.
@Vimiso commented on GitHub (Jul 23, 2024):
I found that on Mac,
pkill ollama
doesn't work, because the main process actually starts with a capital. pkill Ollama should do it.
@dkgaraujo commented on GitHub (Jul 29, 2024):
Fresh install on Ubuntu. Killing it does not work for me. In fact, even killing it by PID is hard, because it spawns a new process (with a new PID = old PID + 2) every second or so. I'm disappointed in this behaviour. I would even understand that they want to create a persistent server experience, but this is not (to my knowledge) good practice, and neither is the absence of syntactic sugar to stop it properly. Relatedly, I don't know why this issue is closed.
@cedricferry commented on GitHub (Jul 29, 2024):
if you use Homebrew:
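The command itself did not survive the mirror; a later comment in this thread (@tyagi-py, Mar 16, 2025) confirms what it was:

```shell
# `brew services stop` also unloads the launchd job that would
# otherwise respawn the process after a plain `pkill`.
cmd="brew services stop ollama"
echo "$cmd"   # run it directly in your terminal
```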
@AledHe commented on GitHub (Aug 17, 2024):
Very useful method for an auto script:
Please sudo first to run the bash script :))
@flashlan commented on GitHub (Aug 19, 2024):
What works for me on Linux is:
@MrBns commented on GitHub (Aug 20, 2024):
Why so much drama to stop Ollama? Why not just ollama stop model/name and have that stop it?
@gonzalezea commented on GitHub (Aug 27, 2024):
This works for me in W10
taskkill /fi "imagename eq ollama app.exe"
@Mustafanaji0413 commented on GitHub (Sep 4, 2024):
This works for Mac! THANKS!!
@gvirus21 commented on GitHub (Sep 10, 2024):
Faced this problem for 30 minutes. Solved it by force-killing Ollama from Mac's Activity Monitor.
@Ranjithdss15 commented on GitHub (Oct 21, 2024):
For Mac:
Close the ollama process from the Activity Monitor and avoid confirming it again with ollama ps from your terminal, as this will restart the process.
@kiran-bsv commented on GitHub (Nov 14, 2024):
If ollama is managed by systemd, you can stop and disable it with these commands:
This worked for me
@alaeddingurel commented on GitHub (Dec 3, 2024):
Thank you! This worked for me too!
@tranvanthai commented on GitHub (Dec 26, 2024):
It worked for me:
sudo pkill -9 ollama Ollama
@webdev23 commented on GitHub (Jan 9, 2025):
Without all the mess, simply closing the HTTP connection client-side is what you are looking for.
@shayneoneill commented on GitHub (Jan 28, 2025):
Has there been any movement on this bug? It's been months now, and my current solution of uninstalling it or rebooting the computer is incredibly onerous.
Does anyone know how to stop it on a Mac without it restarting? And why in the green hells IS it restarting? That seems awfully like a very ill-conceived deliberate design decision.
@godefroi commented on GitHub (Jan 28, 2025):
@shayneoneill it does indeed seem to be a deliberate design decision, and not one that the development team seems interested in changing.
@chenshaoju commented on GitHub (Jan 29, 2025):
Why can't we elegantly stop ollama? For example:
ollama.exe stopserve
@tobsecret commented on GitHub (Jan 30, 2025):
@shayneoneill you can go to Activity Monitor, sort by name, and select and close all processes with "ollama" in the name.
There is an ollama helper process, which I would guess restarts the main ollama process in case it crashes, but it seems it might also restart it if you deliberately kill the ollama process.
@webdev23 commented on GitHub (Jan 31, 2025):
On Linux, without sudo, either wait 5 minutes to free up the VRAM, or:
curl -s http://localhost:11434/api/generate -d '{"model": "'$(ollama ps | tail -n1 | cut -d ' ' -f1 )'", "keep_alive": 0}'
You should get:
... "done_reason":"unload"
@lamberta commented on GitHub (Feb 7, 2025):
I just hit this when running the ollama-darwin.tgz binary (with no menu icon)...
The following worked for me on macOS (from a comment above):
sudo pkill -9 ollama Ollama
@abdrhxyii commented on GitHub (Feb 7, 2025):
I use Windows. I managed to fix it by manually opening Task Manager with Ctrl + Shift + Esc, where I found an ollama process already running; I right-clicked it and ended the task. Then, in PowerShell, I restarted ollama, and now it works.
@tyagi-py commented on GitHub (Mar 16, 2025):
Spent a few minutes struggling; I figured out that if you installed it through Homebrew, you need to run
brew services stop ollama
to stop it, else it'll respawn itself for no reason.
@ramonjd commented on GitHub (May 8, 2025):
Thank you! This was driving me crazy.
@ultimatedirty commented on GitHub (Jul 20, 2025):
In my case I wanted to increase the context length through an environment variable, so I did this:
@austindd commented on GitHub (Aug 3, 2025):
Weirdly, running pkill ollama on macOS did kill the process, but it just automatically restarts as a new process. Clicking the toolbar icon and quitting from the dropdown menu worked as expected.
I think I understand the reasoning behind this behavior... It's convenient for other programs not to have to worry about how to restart Ollama when it fails for whatever reason, so auto-restart makes sense for that use case. But it's very unintuitive behavior. Most professional programmers would just assume pkill ollama or kill -9 <pid> would just work.
I would suggest adding a command to the Ollama CLI tool to force it to tear itself down without restarting. Maybe ollama destroy or something like that?
@manualsh commented on GitHub (Aug 23, 2025):
To exit from Ollama and stop it on Windows, you have these options:
If you are running Ollama in a command prompt or terminal, use Ctrl + C to stop the running server or model.
If Ollama is running as a background process or service, open Task Manager (Ctrl + Shift + Esc), find the processes named "ollama" or similar, and manually end those tasks to stop Ollama.
In PowerShell, you can stop all Ollama processes with this command:
Get-Process | Where-Object {$_.ProcessName -like '*ollama*'} | Stop-Process
When Ollama is running, there is usually an icon in the system tray. Right-clicking that icon gives you an option to "Exit Ollama" to stop it completely.
If you are inside an Ollama model session in the command line, typing the command /bye will exit the model and Ollama session.
These methods should help you properly exit and stop Ollama on Windows, whether from command line, system tray, or Task Manager.
@SinghAayushi commented on GitHub (Apr 20, 2026):
The approach below worked for me: openclaw gateway stop
I did the following to debug:
lsof -i:18789 - to find the process id
ps -j -p 1718 - with the process id, I got the process id of the parent
ps -p 1 - when I checked the parent process id, I saw that it's running from /sbin/launchd
From /sbin/launchd it's automatically picked up and restarted.
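That parent-chain walk generalizes to the sketch below. The port 18789 comes from the comment above; the launchd job label is unknown here, so the commands are printed rather than executed, and the label must be filled in from what launchctl reports:

```shell
#!/bin/sh
# Find what launchd job (if any) owns the listening process, then unload it.
port=18789
echo "lsof -ti :$port                    # PID of the listener"
echo "ps -o ppid= -p <pid>               # its parent PID"
echo "launchctl list | grep -i ollama    # find the launchd job label, if any"
echo 'launchctl bootout gui/$(id -u)/<label>   # unload that job (fill in <label>)'
```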