[GH-ISSUE #2491] How to install ollama on ubuntu with specific version #47966

Closed
opened 2026-04-28 06:12:34 -05:00 by GiteaMirror · 10 comments
Owner

Originally created by @MugdhaHardikar-GSLab on GitHub (Feb 14, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2491

I want to install Ollama on my Ubuntu server, but every few days a new version of Ollama gets installed. I want to pin the version of Ollama installed on my machine. The current install.sh doesn't seem to have that functionality. Is there any way?


@remy415 commented on GitHub (Feb 14, 2024):

What do you mean by `fix the version of the ollama getting installed`? Do you mean you want to keep a specific version installed?


@MugdhaHardikar-GSLab commented on GitHub (Feb 15, 2024):

Yes. Previously I was using Ollama 0.1.20; now I am getting 0.1.24. I want to stick to one version, say 0.1.20, even if newer versions come out. How do I do that?

The latest install gives me:

```
root@5882e714c697:/usr/src# ollama --version
ollama version is 0.1.24
```


@jmorganca commented on GitHub (Feb 20, 2024):

By default, Ollama won't auto-upgrade on Linux.

However, you can run this script to install a previous version:

```
curl -fsSL https://ollama.com/install.sh | sed 's#https://ollama.com/download#https://github.com/jmorganca/ollama/releases/download/v0.1.25#' | sh
```

Note this is experimental and may not work forever.
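To see why this pins the version: the sed expression rewrites the download base URL inside install.sh before the script executes, so every asset is fetched from the tagged release instead of the latest build. A minimal sketch of the substitution on a sample URL (not the real installer):

```shell
# Apply the same sed filter to one download URL to show the rewrite.
line='https://ollama.com/download/ollama-linux-amd64'
pinned=$(echo "$line" | sed 's#https://ollama.com/download#https://github.com/jmorganca/ollama/releases/download/v0.1.25#')
echo "$pinned"
# -> https://github.com/jmorganca/ollama/releases/download/v0.1.25/ollama-linux-amd64
```

The `#` delimiters in the `s` command avoid having to escape every `/` in the URLs.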


@telemetrieTP23 commented on GitHub (Feb 20, 2024):

For me your command for installing a specific version does not work anymore; it always installs the current version (0.1.25) on my Jetson Orin AGX, even if I use:

```
curl -fsSL https://ollama.com/install.sh | sed 's#https://ollama.com/download#https://github.com/jmorganca/ollama/releases/download/v0.1.27#' | sh
```

On the Jetson Xavier AGX I used it to install 0.1.17, after I noticed that starting with 0.1.18 it no longer finds the GPU drivers, so I downgraded.


@remy415 commented on GitHub (Feb 20, 2024):

@telemetrieTP23 I'm working on adding Jetson support. In the meantime, I have a preliminary build available that should work on your Orin AGX until it's fully integrated into the official release: https://github.com/remy415/ollama.

To save you time, ensure that you set the following environment variables:

```
export LD_LIBRARY_PATH="/usr/local/cuda/lib64:/usr/local/cuda/compat:/usr/local/cuda/include"
export OLLAMA_SKIP_CPU_GENERATE="1"
```

Also set one of the following based on which JetPack you are using:

```
# L4T_VERSION.major >= 36 (JetPack 6)
export CMAKE_CUDA_ARCHITECTURES="87"

# L4T_VERSION.major >= 34 (JetPack 5)
export CMAKE_CUDA_ARCHITECTURES="72;87"

# L4T_VERSION.major == 32 (JetPack 4)
export CMAKE_CUDA_ARCHITECTURES="53;62;72"
```
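The mapping above can be sketched as a small helper; `l4t_cuda_archs` is a hypothetical function name, and it assumes you supply the L4T major version yourself (e.g. from `/etc/nv_tegra_release`):

```shell
# Map an L4T major version to the CMAKE_CUDA_ARCHITECTURES value
# from the table above. Returns non-zero for unknown versions.
l4t_cuda_archs() {
  major="$1"
  if [ "$major" -ge 36 ]; then echo "87"           # JetPack 6
  elif [ "$major" -ge 34 ]; then echo "72;87"      # JetPack 5
  elif [ "$major" -eq 32 ]; then echo "53;62;72"   # JetPack 4
  else return 1
  fi
}

export CMAKE_CUDA_ARCHITECTURES="$(l4t_cuda_archs 36)"
```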


@remy415 commented on GitHub (Feb 21, 2024):

@telemetrieTP23 yes, you should be able to build from my fork.

I made a typo in the exports for CUDA_ARCHITECTURES. I corrected my original post, sorry about that.

It's `export CMAKE_CUDA_ARCHITECTURES`.


@kungfu-eric commented on GitHub (Apr 30, 2024):

This works now:

```
curl -fsSL https://ollama.com/install.sh | sed 's#https://ollama.com/download/ollama-linux-${ARCH}${VER_PARAM}#https://github.com/ollama/ollama/releases/download/v0.1.33-rc5/ollama-linux-amd64#' | sh
```

> By default, Ollama won't auto-upgrade on Linux.
>
> However, you can run this script to install a previous version:
>
> ```
> curl -fsSL https://ollama.com/install.sh | sed 's#https://ollama.com/download#https://github.com/jmorganca/ollama/releases/download/v0.1.25#' | sh
> ```
>
> Note this is experimental and may not work forever.


@thyarles commented on GitHub (Jan 21, 2025):

Hi there! Just call it like this:

```
OLLAMA_VERSION=0.5.6 curl -fsSL https://ollama.com/install.sh | sh
```
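One shell subtlety worth noting: in a pipeline, a `VAR=value` prefix applies only to the command it directly precedes, so whether install.sh actually sees `OLLAMA_VERSION` depends on where the assignment sits. A minimal sketch of the scoping rules (stand-in commands, not the installer):

```shell
# Prefixing the right-hand command makes the variable visible there.
seen=$(echo ignored | OLLAMA_VERSION=0.5.6 sh -c 'echo "$OLLAMA_VERSION"')
echo "$seen"     # -> 0.5.6

# Prefixing the left-hand command does NOT carry it across the pipe
# (assuming OLLAMA_VERSION is not already exported in your shell).
unseen=$(OLLAMA_VERSION=0.5.6 echo ignored | sh -c 'echo "${OLLAMA_VERSION:-unset}"')
echo "$unseen"   # -> unset
```

This is why later comments in this thread put the assignment on the `sh` side of the pipe.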

@ABCdatos commented on GitHub (Oct 11, 2025):

Still working, as documented at https://docs.ollama.com/linux:

```
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.11.11 sh
```


@amitthk commented on GitHub (Jan 28, 2026):

> Still working, as documented at https://docs.ollama.com/linux:
>
> `curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.11.11 sh`

This command didn't work for me. It installs:

```
$ ollama --version
ollama version is 0.14.2
Warning: client version is 0.13.3
```

Here's what worked for me:

I went to the releases page and found the link to the tarball compatible with my machine and OS: https://github.com/ollama/ollama/releases

```
# 1. Force stop and purge the snap (skipping snapshots)
sudo snap stop ollama
sudo snap remove --purge ollama

# 2. Kill any remaining background processes
sudo killall ollama 2>/dev/null

# 3. Prepare a directory for Ollama
sudo mkdir -p /usr/share/ollama

# 4. Download the 0.13.3 tarball (replace 0.13.3 with the version you prefer)
curl -L https://github.com/ollama/ollama/releases/download/v0.13.3/ollama-linux-amd64.tgz -o ollama-0.13.3.tgz

# 5. Extract to /usr/share/ollama
sudo tar -C /usr/share/ollama -xzf ollama-0.13.3.tgz

# 6. Create the symlink so the 'ollama' command works globally
sudo ln -sf /usr/share/ollama/bin/ollama /usr/local/bin/ollama

# 7. Create the service file
sudo bash -c 'cat <<EOF > /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=root
Group=root
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"

[Install]
WantedBy=default.target
EOF'

# 8. Reload, enable, and start the service
sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama
```
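The directory layout and symlink from steps 3-6 can be dry-run in a throwaway directory first, which is a cheap way to confirm the paths line up before running the sudo commands. This sketch uses a stand-in script in place of the real Ollama binary:

```shell
# Rehearse the extract-and-symlink layout under a temp root.
tmp=$(mktemp -d)
mkdir -p "$tmp/usr/share/ollama/bin" "$tmp/usr/local/bin"

# Stand-in for the extracted binary (hypothetical, just echoes a version).
printf '#!/bin/sh\necho "ollama version is 0.13.3"\n' > "$tmp/usr/share/ollama/bin/ollama"
chmod +x "$tmp/usr/share/ollama/bin/ollama"

# Same symlink shape as step 6, relative to the temp root.
ln -sf "$tmp/usr/share/ollama/bin/ollama" "$tmp/usr/local/bin/ollama"

out=$("$tmp/usr/local/bin/ollama")
echo "$out"   # -> ollama version is 0.13.3
rm -rf "$tmp"
```

After the real install, `readlink /usr/local/bin/ollama` and `systemctl is-active ollama` are quick sanity checks.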

Reference: github-starred/ollama#47966