[GH-ISSUE #647] Read-only filesystem support #287

Closed
opened 2026-04-12 09:49:43 -05:00 by GiteaMirror · 4 comments

Originally created by @swthorn on GitHub (Sep 29, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/647

Hi,

On NixOS, I ran the installation command described in the README:

curl https://ollama.ai/install.sh | sh

However, the binaries and systemd service were not installed correctly. Is it possible to install this on a read-only filesystem? Or can it be installed in a local directory rather than /usr/bin? (A sketch of such a local-directory install follows the output below.)

This is the entire output:

❯ curl https://ollama.ai/install.sh | sh
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  7391    0  7391    0     0  28304      0 --:--:-- --:--:-- --:--:-- 28209
>>> Downloading ollama...
######################################################################## 100.0%##O=#  #                                                                      
>>> Installing ollama to /usr/bin...
[sudo] password for swthorn: 
>>> Creating ollama user...
useradd: Warning: missing or non-executable shell '/bin/false'
>>> Creating ollama systemd service...
tee: /etc/systemd/system/ollama.service: Read-only file system
>>> Install complete. Run "ollama" from the command line.
❯ ollama
ollama: command not found
❯ zsh
❯ ollama
ollama: command not found
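
A per-user install into a local directory sidesteps the read-only /usr/bin and /etc paths the script writes to. The following is a minimal sketch, not from the thread: the download URL pattern and the paths are assumptions, and on NixOS a generic prebuilt binary may still fail to start without nix-ld or patchelf (which is why the Nixpkgs route discussed later in this thread is usually simpler).

# Sketch: per-user install (URL pattern and paths are assumptions)
mkdir -p ~/.local/bin
curl -fsSL https://ollama.ai/download/ollama-linux-amd64 -o ~/.local/bin/ollama
chmod +x ~/.local/bin/ollama
export PATH="$HOME/.local/bin:$PATH"   # add to your shell rc to persist
ollama --version
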
GiteaMirror added the feature request label 2026-04-12 09:49:43 -05:00

@Dinour commented on GitHub (Oct 7, 2023):

Hey @swthorn

Did you find a solution for this? I'm in the same situation with the same problem.


@BruceMacD commented on GitHub (Oct 27, 2023):

Hey y'all, installing Ollama on a read-only file system will work, but only as a client.

When running ollama serve, Ollama will create logs and SSH keys (for pushing models to ollama.ai) and will download model files to the filesystem.

Running other commands from the CLI should be fine on a read-only file system, as far as I'm aware; you could connect to an external server like this: OLLAMA_HOST=123.456.789 ollama run mistral
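
A minimal sketch of that client-only setup, assuming a reachable host at a placeholder address (192.0.2.10) running ollama serve on its default port 11434:

# On the remote, writable host:
ollama serve
# On the read-only machine, point the CLI at it:
OLLAMA_HOST=192.0.2.10:11434 ollama run mistral
# Pulls triggered this way write to the server, not the read-only client:
OLLAMA_HOST=192.0.2.10:11434 ollama pull mistral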


@Diti commented on GitHub (Dec 27, 2023):

Paraphrasing @BruceMacD: Ollama is already present in Nixpkgs (pkgs/tools/misc/ollama/default.nix on the nixos-23.11 branch). It doesn't exist as a service (yet), but you can still simply install the ollama package (see man configuration.nix), or use nix-shell (feel free not to use nohup):

nix-shell --packages ollama --run 'nohup ollama serve' &
nix-shell --packages ollama --run 'ollama pull mistral'
nix-shell --packages ollama --command 'ollama run mistral'
# >>> Send a message (/? for help)

I have tried running Ollama as a service, but it complains about needing a $HOME, probably for storing the models. As far as I understand, the models will NEED to be downloaded beforehand (with their proper sha256 checksum, see nix-prefetch-url --unpack), and be placed where Ollama expects them to be.
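
A hedged sketch of one way around the $HOME complaint: point Ollama's writable state at a dedicated directory before starting the server. The paths here are illustrative, and whether your Ollama version honors the OLLAMA_MODELS variable is an assumption to verify:

# Give ollama serve a writable state directory (paths illustrative)
export HOME=/var/lib/ollama                  # keys and logs live under $HOME/.ollama
export OLLAMA_MODELS=/var/lib/ollama/models  # model store location, if honored
mkdir -p "$OLLAMA_MODELS"
nix-shell --packages ollama --run 'ollama serve'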


@mxyng commented on GitHub (Jan 16, 2024):

Closing this, since Ollama requires some form of write access in order to download and run models.

