[GH-ISSUE #1018] How to keep ollama running in a docker container #26258

Closed
opened 2026-04-22 02:24:58 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @shahriyardx on GitHub (Nov 6, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1018

I am using Docker to run Ollama locally, but each time I have to run:

```
docker exec -it ollama ollama run llama2
```

I don't want that; I want it to keep running so I can use the API it exposes.

docker-compose.yml:

```yml
version: '3.9'

services:
  ollama:
    container_name: ollama
    image: ollama/ollama
    ports:
      - 11434:11434
    volumes:
      - ./data:/root/.ollama
```
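For reference, once a compose stack like the one above is up, the server's HTTP API should already be reachable on the mapped port; a minimal sketch of checking it (the model name `llama2` is just an example, and the model must already be pulled for the generate call to succeed):

```shell
# Start the stack in the background
docker compose up -d

# The root endpoint answers once the server is up
curl http://localhost:11434/

# Generate a completion over the HTTP API
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'
```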

@pepperoni21 commented on GitHub (Nov 6, 2023):

I think you can run `ollama serve` instead


@shahriyardx commented on GitHub (Nov 6, 2023):

> I think you can run `ollama serve` instead

Can you give an example of a docker-compose file?


@pepperoni21 commented on GitHub (Nov 6, 2023):

> > I think you can run `ollama serve` instead
>
> Can you give an example of a docker-compose file?

Actually, just running the Docker image as you're doing starts the server, so you should be able to access the API since you exposed the port.


@mxyng commented on GitHub (Nov 6, 2023):

> docker exec -it ollama ollama run llama2

This command enters a running Docker container and runs llama2 interactively. The server is already running and exposing the port on 0.0.0.0:11434. You can check with `docker ps -f name=ollama`.
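Building on that, a model can also be pulled and listed through the HTTP API instead of `docker exec`, so the container never needs interactive access; a sketch, assuming the `ollama` container from the compose file above is running:

```shell
# Confirm the ollama container is up and its port is published
docker ps -f name=ollama

# Pull a model via the HTTP API rather than `docker exec`
curl http://localhost:11434/api/pull -d '{"name": "llama2"}'

# List the models the server now has available
curl http://localhost:11434/api/tags
```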


@KasperGroesLudvigsen commented on GitHub (Feb 2, 2024):

I think the container shuts down because it does not recognize Ollama as a process. A hacky way to keep the container running could be to start a process in the container that has no real effect, like:

```
tail -f /dev/null
```
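As a sketch, that hack could be wired into the compose file from the issue by overriding the entrypoint; note this replaces the image's default `ollama serve` startup, so the server would then have to be started manually (e.g. via `docker exec`):

```yml
services:
  ollama:
    image: ollama/ollama
    # Keep the container alive with a no-op process instead of the server
    entrypoint: ["tail", "-f", "/dev/null"]
```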


@KasperGroesLudvigsen commented on GitHub (Feb 5, 2024):

I also realized that doing this in the Dockerfile will keep the container running:

```
COPY ./.ollama .

ENTRYPOINT ["/bin/ollama"]
CMD ["serve"]
```
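For completeness, a minimal sketch of building and running an image from a Dockerfile like the snippet above (the image tag `my-ollama` is hypothetical):

```shell
# Build an image from the Dockerfile above (the tag name is arbitrary)
docker build -t my-ollama .

# Run it detached; the baked-in ENTRYPOINT/CMD start `ollama serve`,
# so the container keeps running and serves the API on port 11434
docker run -d --name my-ollama -p 11434:11434 my-ollama
```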

@enachedaniel19 commented on GitHub (Feb 7, 2024):

Did you find a solution for this?
I need to keep Ollama running in a Docker container (`ollama run llama2:13b`) and I can't manage to do so.


@KasperGroesLudvigsen commented on GitHub (Feb 7, 2024):

My suggestion regarding the Dockerfile works for me.

Reference: github-starred/ollama#26258