[PR #3615] Install Ollama on OSTree systems #11219

Open
opened 2026-04-12 23:24:36 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/3615
Author: @ericcurtin
Created: 4/12/2024
Status: 🔄 Open

Base: `main` ← Head: `ollama-ostree`


📝 Commits (1)

  • f9d2061 Install Ollama on OSTree systems

📊 Changes

2 files changed (+29 additions, -5 deletions)


📝 gpu/amd_linux.go (+15 -0)
📝 scripts/install.sh (+14 -5)

📄 Description

There's a plethora of OSTree OSes in the Fedora family:

Silverblue, Kinoite, CoreOS, IoT, Onyx, Sericea, Vauxite

In the CentOS Stream family:

Automotive Stream Distribution, CoreOS

In the Red Hat family:

Red Hat In-Vehicle Operating System, Red Hat Enterprise Linux CoreOS, RHEL for Edge

Then there's the Universal Blue family.

Tools like podman-machine and Podman Desktop on Windows and macOS also use Fedora CoreOS as the host OS, so there is that as well.

The list goes on and on.

These OSes are ideal for containerized AI LLMs like Ollama.

I eventually got this working with podman, which is probably the best route on these OSes (or an rpm, once someone packages one).

This change gets the install script working.

/usr/bin and /usr/share aren't writable on these systems, but /usr/local/bin and /usr/local/share are.

So this change ensures that if /usr/local/bin is used during installation, we also use /usr/local/share (when /usr/local/share exists). With that, everything works on these OSes for non-containerized installs.
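The prefix-selection logic described above can be sketched roughly as follows (the function name is illustrative, not necessarily what `scripts/install.sh` uses):

```shell
#!/bin/sh
# Pick the share prefix to match the bin prefix chosen at install time.
# On OSTree systems /usr/bin and /usr/share are read-only, while the
# /usr/local hierarchy is writable, so the two prefixes must stay in sync.
choose_share_dir() {
    case "$1" in
        /usr/local/bin) echo /usr/local/share ;;
        *)              echo /usr/share ;;
    esac
}

choose_share_dir /usr/local/bin   # prints /usr/local/share
choose_share_dir /usr/bin         # prints /usr/share
```

Per the description, the actual change also verifies that /usr/local/share exists before selecting it.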


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-12 23:24:36 -05:00
Reference: github-starred/ollama#11219