[GH-ISSUE #696] Offline Installation and Model Download #320

Closed
opened 2026-04-12 09:52:30 -05:00 by GiteaMirror · 12 comments
Owner

Originally created by @OguzcanOzdemir on GitHub (Oct 4, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/696

Hello,

I'm trying to install Ollama on an offline Ubuntu computer. Due to the lack of an internet connection, I need guidance on how to perform this installation offline. Additionally, I would like to understand how to download and use models on this offline Ubuntu machine.

Here are the specific questions and challenges I'm facing:

Offline Installation:

Is it possible to download all the necessary installation files and dependencies on an online machine and then transfer them to the offline Ubuntu computer?
Can you provide step-by-step instructions for manually installing the software offline?
Are there any specific dependencies or libraries that I need to be aware of for the installation?

Offline Model Usage:
How can I download pre-trained models or data sets for the software offline?
Once the models are downloaded, how can I integrate them with the software and use them?

I would greatly appreciate any guidance or assistance you can provide to help me with this offline installation and model usage.
Thank you in advance for your help!


@BruceMacD commented on GitHub (Oct 4, 2023):

Hi @OguzcanOzdemir, Ollama will work offline. Here are some install steps.

**Offline installation:**
This is possible by downloading the `ollama-linux-ARCH` binary and then moving it onto your offline machine.
You can find the binary in the release assets here:
https://github.com/jmorganca/ollama/releases

If you want to use your GPU, you will also need to install the relevant CUDA driver:
https://developer.nvidia.com/cuda-downloads?target_os=Linux&target_arch=x86_64&Distribution=Ubuntu&target_version=20.04&target_type=deb_local

**Offline model usage:**
The easiest way to do this is to download the Ollama models on a machine that is connected to the internet, then move the `~/.ollama` directory to the offline machine.
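The steps above can be sketched as a short shell session. The release tag and architecture below are examples, not prescriptions: substitute the latest tag from the releases page and your machine's architecture. The network and transfer commands are shown as comments since they depend on your setup.

```shell
# On the online machine: build the release-asset URL and fetch the binary.
ARCH=amd64    # or arm64
VER=v0.1.32   # example tag -- check the releases page for the latest
URL="https://github.com/jmorganca/ollama/releases/download/${VER}/ollama-linux-${ARCH}"
echo "download: $URL"
#   curl -L -o ollama "$URL" && chmod +x ollama

# Move the binary to the offline machine (USB stick, scp over a LAN, ...):
#   sudo install -m 755 ollama /usr/local/bin/ollama

# Bring already-pulled models along as well:
#   tar czf ollama-models.tar.gz -C "$HOME" .ollama   # on the online machine
#   tar xzf ollama-models.tar.gz -C "$HOME"           # on the offline machine
```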


@Mouradif commented on GitHub (Oct 5, 2023):

What if I already downloaded the model from Meta, where should I put it for ollama to be able to use it?
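One approach for weights obtained directly from Meta is to import them through a Modelfile, assuming they have first been converted to GGUF format (the filename below is a hypothetical placeholder):

```
# Modelfile -- the GGUF path is a placeholder for your converted weights
FROM ./llama-2-7b-chat.Q4_K_M.gguf
```

Running `ollama create llama2-local -f Modelfile` and then `ollama run llama2-local` should register and serve the local model without needing network access.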


@HEI201 commented on GitHub (Mar 14, 2024):

Hi, I want to download models from Ollama and move them to an offline computer to use with Ollama. Is there any guide or instructions to follow?


@zioalex commented on GitHub (Mar 26, 2024):

Why has this been closed? I do not see any real answer here.
Does anybody have an update on how to do it?


@nbfhscl commented on GitHub (Mar 30, 2024):

need an answer


@Seedmanc commented on GitHub (Apr 12, 2024):

They've disabled pulling models, now we're doomed.


@Pyenb commented on GitHub (Aug 1, 2024):

For anyone interested in where to put the model after downloading: https://github.com/ollama/ollama/blob/main/docs/faq.md#where-are-models-stored

But I am still not sure where you can download them just as a standalone model
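A small shell check can locate an existing model store. The two paths below are the defaults the linked FAQ lists for a Linux user install and a Linux service install; `OLLAMA_MODELS` overrides them:

```shell
# Look for an existing model store in the default locations.
for d in "$HOME/.ollama/models" "/usr/share/ollama/.ollama/models"; do
  [ -d "$d" ] && echo "model store found: $d"
done

# Or point Ollama at an explicit directory before starting the server:
export OLLAMA_MODELS="$HOME/.ollama/models"
echo "OLLAMA_MODELS=$OLLAMA_MODELS"
```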


@Pyenb commented on GitHub (Aug 19, 2024):

Hey there, small update for anyone interested. Since this was still bothering me, I took matters into my own hands and created an [Ollama model repository](https://github.com/Pyenb/Ollama-models), where you can download the zipped official Ollama models and import them to your offline machine or wherever.

Any feedback is appreciated 👍 More models will be coming soon.


@MikeB2019x commented on GitHub (Aug 19, 2024):

> Hey there, small update for anyone interested. Since this was still bothering me, I took matters into my own hands and created an [Ollama model repository](https://github.com/Pyenb/Ollama-models), where you can download the zipped official Ollama models and import them to your offline machine or wherever.
>
> Any feedback is appreciated 👍 More models will be coming soon.

Literally just had this need come up an hour ago. Well done!


@amirrezaDev1378 commented on GitHub (Aug 28, 2024):

If anyone did not find a solution for their problem, I've created this simple app that will give you links to download any model, in any size you want, from the Ollama registry:

https://github.com/amirrezaDev1378/ollama-model-direct-download


@yelog commented on GitHub (Oct 12, 2024):

> hi, i want to download models from ollama and move to a offline computer to use it in ollama, is there any guide or instructions to flow?

@HEI201 I successfully installed an offline model downloaded from Hugging Face; you can check it out: [Installing Ollama offline and loading offline models](https://yelog.org/2024/10/10/install-ollama-offline-english/)


@cgjosephlee commented on GitHub (Oct 21, 2024):

Since Ollama has a CLI interface similar to Docker's, I implemented `save` and `load` functions for Ollama, just like Docker's, for transferring models to a remote server without internet access:
https://github.com/cgjosephlee/ollama-save-load

```sh
ollama pull gemma2:2b-instruct-q4_K_M
ollama list

export OLLAMA_MODELS="/path/to/ollama/models"  # change this if you use an alternative path
./ollama-save.py gemma2:2b-instruct-q4_K_M | gzip > gemma2.tar.gz
./ollama-load.py gemma2.tar.gz
```
Reference: github-starred/ollama#320