[GH-ISSUE #8266] Installation and data directory should be customizable by user #5285

Closed
opened 2026-04-12 16:27:51 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @gnusupport on GitHub (Dec 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/8266

What is the issue?

Too little hard disk space, so I cannot install the ollama files.

I should be able to say WHERE ollama is installed and WHERE the data files, such as models, are stored. I also do not like the notion of installing it system-wide; I would rather install it as a user. People have various hard disks; I have many hard disks and mount points, and I do not keep everything on /usr.

Please consider this proposal.

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

No response

GiteaMirror added the question label 2026-04-12 16:27:51 -05:00
Author
Owner
@rick-github commented on GitHub (Dec 29, 2024):

https://github.com/ollama/ollama/blob/main/docs/windows.md#changing-install-location
https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-set-them-to-a-different-location
https://github.com/ollama/ollama/blob/main/docs/linux.md#manual-install
Author
Owner

@gnusupport commented on GitHub (Dec 29, 2024):

Thanks. I understand I can set `$OLLAMA_MODELS` to change the data directory.

As for the ollama software itself, I am not on Windows, so the link above may not be relevant for a GNU/Linux operating system when it comes to choosing a destination for the software.

Is there a way to set the destination directory for the ollama software to some other location within my `$HOME`?

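The `$OLLAMA_MODELS` redirect mentioned above can be sketched as follows; the path is an illustrative example chosen for this sketch, not an official default:

```shell
# Point the ollama server at a model directory on a disk with enough space.
# "$HOME/big-disk/ollama-models" is illustrative; any writable path works.
export OLLAMA_MODELS="$HOME/big-disk/ollama-models"
mkdir -p "$OLLAMA_MODELS"

# The server reads the variable at startup, so restart it afterwards:
# ollama serve
```

The variable must be set in the environment of the `ollama serve` process (for a systemd install that means an `Environment=` override), not just in the shell running the client.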
Author
Owner

@rick-github commented on GitHub (Dec 29, 2024):

https://github.com/ollama/ollama/blob/main/docs/linux.md#manual-install

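The manual-install steps linked above can be adapted to a per-user install by unpacking the tarball under a prefix inside `$HOME`; a minimal sketch, where `$HOME/.local/ollama` is an illustrative prefix rather than an official default:

```shell
# Per-user install prefix (illustrative; pick any directory you own).
PREFIX="$HOME/.local/ollama"
mkdir -p "$PREFIX"

# Download and unpack into the user prefix (commented out here;
# run these two lines on a real machine):
# curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o /tmp/ollama.tgz
# tar -C "$PREFIX" -xzf /tmp/ollama.tgz

# The tarball unpacks bin/ and lib/; put bin/ on PATH for this session.
export PATH="$PREFIX/bin:$PATH"
```

Adding the `export PATH=…` line to `~/.profile` (or the shell's rc file) makes the per-user install available in future sessions.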
Author
Owner

@pdevine commented on GitHub (Dec 30, 2024):

I'll go ahead and close the issue. @gnusupport feel free to keep commenting if the FAQs/docs aren't enough.

Author
Owner

@gnusupport commented on GitHub (Dec 30, 2024):

I have installed it with

```
$ curl -C - -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
$ mkdir ollama; tar -C ollama -xzf ollama-linux-amd64.tgz
```

I can run `ollama serve`, though `ollama list` gives no models. Is it supposed to?

Author
Owner

@rick-github commented on GitHub (Dec 30, 2024):

You have to pull the models that you want to use. You can use `ollama run <model>` to both pull and run the model, or just `ollama pull <model>` to download the model. You can find available models at https://ollama.com/models. If you want to run a model from HuggingFace (that is suitably formatted), you can use `ollama pull hf.co/<model>`. You can also [import](https://github.com/ollama/ollama/blob/main/docs/import.md) models from different sources.

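The pull-then-list flow described above can be sketched as a script; `llama3.2` is just an example tag (any model from https://ollama.com/models works), and the snippet is guarded so it is a no-op on machines without ollama installed:

```shell
# Download a model so that `ollama list` shows it.
# Requires the ollama binary and a running server; skipped otherwise.
status="skipped"
if command -v ollama >/dev/null 2>&1; then
    ollama pull llama3.2        # example tag; downloads the model
    ollama list                 # the pulled model now appears in the list
    status="pulled"
fi
echo "$status"
```

`ollama run llama3.2` would combine the pull with starting an interactive session.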
Reference: github-starred/ollama#5285