[GH-ISSUE #9285] How to automatically start the Ollama serve On MacOS #68109

Closed
opened 2026-05-04 12:33:42 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @super617 on GitHub (Feb 22, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9285

How can I automatically start the Ollama `serve` service on macOS at boot, without using the command line?

GiteaMirror added the question label 2026-05-04 12:33:42 -05:00

@gospreema commented on GitHub (May 8, 2025):

Ask ChatGPT how to do this. It isn't a function of Ollama to start itself; the OS is responsible for starting apps and processes at boot. ChatGPT knows of two ways.
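The two approaches alluded to here are presumably macOS Login Items and a launchd agent. As a sketch of the first, a Login Item can be added from a terminal via the System Events scripting interface (this assumes the desktop app is installed at the standard `/Applications/Ollama.app` path):

```shell
# Register Ollama.app as a Login Item so macOS launches it at every login.
# Path is an assumption; adjust if the app is installed elsewhere.
osascript -e 'tell application "System Events" to make login item at end with properties {path:"/Applications/Ollama.app", hidden:false}'
```

The same result can be achieved without the command line in System Settings > General > Login Items.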

Author
Owner

@dhiltgen commented on GitHub (Jul 4, 2025):

If you use the official Ollama Mac app, installed via https://ollama.com/download/Ollama.dmg, it will automatically start Ollama upon login.

If you want more control, we publish an `ollama-darwin.tgz` file in every release on GitHub (https://github.com/ollama/ollama/releases), which contains the underlying binaries and omits the desktop app. You can set up your own system service using launchd to run `ollama serve` with your preferred environment variables to configure the service.
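A minimal launchd user agent along those lines might look like the following. This is a sketch, not an official Ollama-provided file: the label, binary path, and environment variable value are illustrative assumptions. Save it as `~/Library/LaunchAgents/com.example.ollama.plist`:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <!-- Illustrative label; pick your own reverse-DNS name -->
  <key>Label</key>
  <string>com.example.ollama</string>
  <!-- Path assumes you extracted ollama-darwin.tgz into /usr/local/bin -->
  <key>ProgramArguments</key>
  <array>
    <string>/usr/local/bin/ollama</string>
    <string>serve</string>
  </array>
  <!-- Set preferred environment variables here, e.g. the listen address -->
  <key>EnvironmentVariables</key>
  <dict>
    <key>OLLAMA_HOST</key>
    <string>127.0.0.1:11434</string>
  </dict>
  <!-- Start at login and restart the server if it exits -->
  <key>RunAtLoad</key>
  <true/>
  <key>KeepAlive</key>
  <true/>
</dict>
</plist>
```

Load it with `launchctl bootstrap gui/$(id -u) ~/Library/LaunchAgents/com.example.ollama.plist` (or `launchctl load -w` on older macOS versions).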


Reference: github-starred/ollama#68109