[GH-ISSUE #8341] [feature] start ollama automatically on startup #5347

Closed
opened 2026-04-12 16:32:52 -05:00 by GiteaMirror · 1 comment

Originally created by @remco-pc on GitHub (Jan 7, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8341

I've been trying to get `ollama serve` to start automatically at startup (from a Docker PHP init), but it won't start when backgrounded with `&`. I then tried putting it in a script with a lock file, run from cron, to see if that would start it. Cron starts my script, but the script then does not start `ollama serve`, which it should!
Running it in bash works like a charm, but automating this is currently a pain in the ...
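For context, a common reason this pattern fails is that a job backgrounded with `&` from cron or an init hook has no controlling terminal and inherits open stdout/stderr, so it can be killed or blocked when the parent exits. A minimal wrapper sketch (all paths and names here are illustrative, not from the issue, and it assumes `ollama` is on the `PATH`) detaches the process with `nohup` and redirects its output:

```shell
#!/bin/sh
# Hypothetical cron/init wrapper -- a sketch, not the project's official script.
# nohup detaches the server from the terminal; redirecting stdout/stderr keeps
# cron from holding the pipe open or mailing the output.

LOCKFILE=/tmp/ollama.lock   # assumed lock path
LOGFILE=/tmp/ollama.log     # assumed log path

if [ -e "$LOCKFILE" ]; then
  echo "ollama already running (lock present)"
  exit 0
fi

touch "$LOCKFILE"
# Start the server fully detached; errors (including "command not found"
# if ollama is not installed) land in the log file instead of the terminal.
nohup ollama serve >"$LOGFILE" 2>&1 &
echo "started ollama serve (PID $!)"
```

A plain lock file like this is not removed on crash; a more robust variant would store the PID in the lock file and check whether that process is still alive before refusing to start.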

GiteaMirror added the feature request label 2026-04-12 16:32:52 -05:00

@dhiltgen commented on GitHub (Jul 4, 2025):

I'm not sure exactly what you're trying to accomplish, but there are existing patterns that start Ollama automatically.

- For a full Linux OS, our [install.sh](https://github.com/ollama/ollama/blob/main/scripts/install.sh) wires up systemd to start it as a system service.
- For containers, we have a Docker image which handles running the service - see https://github.com/ollama/ollama/blob/main/docs/docker.md

Based on your "docker php init" I'm assuming a Docker-based solution, so typically this would involve setting up multiple containers and networking so they can communicate with each other. You can set this up manually, or via [Docker Compose](https://docs.docker.com/compose/), Kubernetes, etc. From within your PHP container, you would access the Ollama server running in the other container by its local name.
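The two-container setup suggested above can be sketched with a minimal Docker Compose file (service names, the PHP image, and the volume name are illustrative assumptions; `11434` is Ollama's default API port):

```yaml
# docker-compose.yml -- minimal sketch of the two-container pattern.
services:
  ollama:
    image: ollama/ollama        # official image; its entrypoint runs `ollama serve`
    ports:
      - "11434:11434"           # expose the default Ollama API port to the host
    volumes:
      - ollama:/root/.ollama    # persist downloaded models across restarts

  php-app:
    image: php:cli              # stand-in for your own PHP container
    depends_on:
      - ollama
    # From inside this container, the server is reachable at
    # http://ollama:11434 via Compose's built-in DNS.

volumes:
  ollama:
```

With this layout there is nothing to background or cron: Compose (or `docker compose up -d`) keeps the `ollama` service running, and the PHP container addresses it by service name instead of `localhost`.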

Reference: github-starred/ollama#5347