[GH-ISSUE #10118] Maybe propose podman as it works well with GPU too #6638

Open
opened 2026-04-12 18:19:15 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @metal3d on GitHub (Apr 3, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10118

I know that Docker is widely used, but Red Hat, Fedora, and many other distributions ship Podman instead. It offers roughly the same approach to starting ollama, plus a very nice systemd integration called "Quadlet".

To start ollama with Podman, after installing the NVIDIA Container Toolkit the same way as for Docker, we can launch it in a rootless container:

podman run -d -it --gpus all --security-opt label=disable -p 11434:11434 docker.io/ollama/ollama
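Before launching ollama, it can help to confirm that Podman actually sees the GPU. A hedged sketch, assuming the NVIDIA Container Toolkit's `nvidia-ctk` is installed and a CUDA base image is available (the image tag here is only an example):

```shell
# Generate the CDI specification for the installed GPUs (run once, as root).
sudo nvidia-ctk cdi generate --output=/etc/cdi/nvidia.yaml

# List the devices the CDI spec exposes, e.g. nvidia.com/gpu=all.
nvidia-ctk cdi list

# Verify the GPU is visible from a rootless container.
podman run --rm --device nvidia.com/gpu=all \
  --security-opt label=disable \
  docker.io/nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```

If `nvidia-smi` prints the GPU table, the same `--device nvidia.com/gpu=all` (or `--gpus all`) flag will work for the ollama container.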

The very nice part is that we can create one of these files:

  • ~/.config/containers/systemd/ollama.container for a normal user (rootless)
  • /etc/containers/systemd/ollama.container for root-level system integration

containing this:

[Container]
ContainerName=ollama
Image=docker.io/ollama/ollama:latest
PublishPort=11434:11434
SecurityLabelDisable=true
Volume=/opt/models:/opt/models:Z
Environment=OLLAMA_MODELS=/opt/models
AddDevice=nvidia.com/gpu=all
AutoUpdate=registry

[Service]
Restart=always

[Install]
WantedBy=default.target
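Quadlet can check the file before systemd is reloaded. A sketch using its dry-run mode; the generator path below is common on Fedora-family systems but may differ elsewhere:

```shell
# Ask the Quadlet generator to parse the user-level .container files
# and print the systemd units it would produce, without installing them.
/usr/lib/systemd/system-generators/podman-system-generator --user --dryrun
```

Any syntax error in `ollama.container` is reported here instead of failing silently at `daemon-reload` time.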

Then run systemctl daemon-reload and systemctl start ollama (Quadlet generates ollama.service from the .container file).
Or, for a normal user, systemctl --user daemon-reload and systemctl --user start ollama.
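Put together, the rootless workflow might look like this (a sketch; assumes the unit file above is in place):

```shell
# Regenerate units from Quadlet files, then start the service.
systemctl --user daemon-reload
systemctl --user start ollama.service

# Rootless user services only start at boot if lingering is enabled.
loginctl enable-linger "$USER"

# Check the service, then sanity-check the ollama API.
systemctl --user status ollama.service
curl http://localhost:11434/api/version
```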

Now we have ollama in a container that starts with the system, is automatically updated, keeps its models in a separate volume, and so on.

Maybe we can add this to the documentation; I think Podman deserves more visibility, as it is installed by default on Rocky, Alma, Fedora, Red Hat...

GiteaMirror added the feature request label 2026-04-12 18:19:15 -05:00

@jbtrystram commented on GitHub (Jun 3, 2025):

You should check ramalama https://github.com/containers/ramalama

<!-- gh-comment-id:2935168391 -->

@dikkedimi commented on GitHub (Dec 30, 2025):

This is recommended for Bazzite (there is no way to install Docker manually there), but it broke and now I'm having trouble finding the guide I followed. I tried installing SearXNG, but when I moved my computer to another room it stopped working. I have no clue, so now I'm trying to see what I might have changed.

<!-- gh-comment-id:3700671478 -->
Reference: github-starred/ollama#6638