[GH-ISSUE #11732] Repeatedly getting PROTOCOL_ERROR when running the installation curl command #33530

Open
opened 2026-04-22 16:19:38 -05:00 by GiteaMirror · 21 comments
Owner

Originally created by @johann-petrak on GitHub (Aug 6, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11732

What is the issue?

This happened on two different Linux machines: running curl -fsSL https://ollama.com/install.sh | sh produces

>>> Installing ollama to /usr/local
>>> Downloading Linux amd64 bundle
##############                                                            20.8%curl: (92) HTTP/2 stream 1 was not closed cleanly: PROTOCOL_ERROR (err 1)


gzip: stdin: unexpected end of file

On the second machine only the percentage was different and it was stream 0 instead of stream 1.

Relevant log output


OS

Ubuntu 24.04.2 LTS

GPU

not relevant

CPU

not relevant

Ollama version

Tried to install the latest version as of today (2025-08-06); the installer does not show which version is being installed.

GiteaMirror added the bug label 2026-04-22 16:19:38 -05:00

@rick-github commented on GitHub (Aug 6, 2025):

Does manually pulling the archive work?

curl -LO https://ollama.com/download/ollama-linux-amd64.tgz

@johann-petrak commented on GitHub (Aug 6, 2025):

I tried that twice and it worked without a problem. It's hard to say whether there was just some glitch in the connection to the server earlier, but what I found a bit frustrating is that when this fails, one apparently has to start from scratch, with no option to simply continue a partial download of a fairly large file.
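A note on resuming: curl does support continuing a partial download via -C -, which picks up at the current size of the output file; the install script just doesn't use it. A minimal sketch, demonstrated against a local file:// URL so it runs without a network (the real-server command is shown in the trailing comment):

```shell
# curl's -C - resumes a transfer at the current size of the output file.
printf '0123456789' > remote.bin      # stand-in for the full archive
head -c 5 remote.bin > local.bin      # simulate an interrupted download
curl -sS -C - -o local.bin "file://$PWD/remote.bin"   # fetch only the tail
cmp -s remote.bin local.bin && echo "resume ok"

# Against the real server, the equivalent would be:
#   curl -fL -C - -o ollama-linux-amd64.tgz \
#     https://ollama.com/download/ollama-linux-amd64.tgz
```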


@Marhy172 commented on GitHub (Aug 6, 2025):

I got the same error. On the first try, the error also occurred while downloading the Linux amd64 bundle. On the second try it got one step further:

>>> Cleaning up old version at /usr/local/lib/ollama
>>> Installing ollama to /usr/local
>>> Downloading Linux amd64 bundle
######################################################################## 100.0%
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
>>> Downloading Linux ROCm amd64 bundle
####################################################################      95.8%curl: (92) HTTP/2 stream 1 was not closed cleanly: PROTOCOL_ERROR (err 1)

My OS is Ubuntu 24.04.2 LTS and I tried to install today's version. The install worked on the third try for me.


@rick-github commented on GitHub (Aug 6, 2025):

When it was downloading and failed, was it slower (hash marks appeared slower) than when it succeeded?


@enterra2010 commented on GitHub (Aug 7, 2025):

I am also seeing this. After looking into the issue, it appears that the protocol error might relate to the TLS version?

Trying:
curl https://ollama.com/install.sh --http1.1 | sh


@enterra2010 commented on GitHub (Aug 7, 2025):

I will also say that curl has been progressing to a random percentage each time before failing, and is VERY slow. Will update if I find success.


@rick-github commented on GitHub (Aug 7, 2025):

I believe what's happening is a server timeout. If the download is very slow, the server closes the connection after some time and the download fails. I will do some testing to see if my guess holds water.


@johann-petrak commented on GitHub (Aug 8, 2025):

I believe what's happening is a server timeout. If the download is very slow, the server closes the connection after some time and the download fails. I will do some testing to see if my guess holds water.

This is also my impression. The problem is that when that happens, a large portion of the archive may already have been downloaded, and the current implementation makes no attempt to resume the download and instead starts from scratch, which makes it not just slower for the user but also puts unnecessary additional load on the server.


@enterra2010 commented on GitHub (Aug 8, 2025):

It seems that you were both correct. I tried again this morning and it worked without issue. I might guess this is related to the newly released OpenAI models.


@WaxArsatia commented on GitHub (Aug 14, 2025):

Same here, will try again tomorrow and see if the issue persists.


@Murtaza1511 commented on GitHub (Aug 15, 2025):

I also tried multiple times and it kept failing, but today I tried prefixing the command with sudo and it worked for me. Do give it a try:

Image

sudo curl -fsSL https://ollama.com/install.sh | sh


@SamuelNLP commented on GitHub (Aug 18, 2025):

I was only able to install it by downloading the tar.gz manually and then updating the sh install script:

curl -C - -L --http1.1 --no-alpn --retry 5 --retry-delay 5 https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz

@eroyee commented on GitHub (Sep 1, 2025):

I too have had this issue; I am on a slow connection here, so it could well be related to that, as others have suggested. I've dealt with it in a very 'hacky' way, but it works for me and may be of interest to some:

Download install.sh directly (wget https://ollama.ai/install.sh) and at about line 82 edit the script to use wget in place of curl, as follows:

#curl --fail --show-error --location --progress-bar \
#"https://ollama.com/download/ollama-linux-${ARCH}.tgz${VER_PARAM}" | \
wget https://ollama.com/download/ollama-linux-${ARCH}.tgz
$SUDO tar -xzf ollama-linux-${ARCH}.tgz -C "$OLLAMA_INSTALL_DIR"

Make it executable (chmod +x install.sh), then run it from the CLI: ./install.sh


@redtux commented on GitHub (Sep 2, 2025):

I too have had this issue, am on a slow connection here so it could well be related to that as others have suggested.

@eroyee, I have been experiencing this issue periodically for several weeks now. As I have both broadband and 5G, I don't think it's related to bandwidth (alone); there must be some other issue involved as well, e.g. rate limits. 🤷 🤭

🎉 Yesterday, I was able to update Ollama (including ROCm install) on my notebook after a long time.


@nathanru commented on GitHub (Sep 6, 2025):

Had to connect my 60 ethernet cable to finally download it. None of the above tricks worked while connected to Wi-Fi.


@rosenblat commented on GitHub (Sep 12, 2025):

Getting the same error. I've updated Ollama many times before without issue. I tried seven-odd times to download, and each time it fails somewhere between 68% and 78% with the error:

(92) HTTP/2 stream 1 was not closed cleanly: PROTOCOL_ERROR (err 1)


@wnarifin commented on GitHub (Sep 24, 2025):

I too have had this issue; I am on a slow connection here, so it could well be related to that, as others have suggested. I've dealt with it in a very 'hacky' way, but it works for me and may be of interest to some:

Download install.sh directly (wget https://ollama.ai/install.sh) and at about line 82 edit the script to use wget in place of curl, as follows:

#curl --fail --show-error --location --progress-bar \
#"https://ollama.com/download/ollama-linux-${ARCH}.tgz${VER_PARAM}" | \
wget https://ollama.com/download/ollama-linux-${ARCH}.tgz
$SUDO tar -xzf ollama-linux-${ARCH}.tgz -C "$OLLAMA_INSTALL_DIR"

Make it executable (chmod +x install.sh), then run it from the CLI: ./install.sh

Thanks for this solution. As an improvement, I added -c to wget so that it will continue an interrupted download:
wget -c https://ollama.com/download/ollama-linux-amd64.tgz


@jack5github commented on GitHub (Sep 27, 2025):

I followed the instructions in the post above and am still getting a download error. Here is a full log of my terminal output showing the error in action. The download always fails at exactly 98.3%.

ERROR 618: jwt:expired.

./install.sh

>>> Cleaning up old version at /usr/local/lib/ollama

>>> Installing ollama to /usr/local

>>> Downloading Linux amd64 bundle

--2025-09-27 17:45:15-- https://ollama.com/download/ollama-linux-amd64.tgz

Resolving ollama.com (ollama.com)... 34.36.133.15

Connecting to ollama.com (ollama.com)|34.36.133.15|:443... connected.

HTTP request sent, awaiting response... 307 Temporary Redirect

Location: https://github.com/ollama/ollama/releases/latest/download/ollama-linux-amd64.tgz [following]

--2025-09-27 17:45:15-- https://github.com/ollama/ollama/releases/latest/download/ollama-linux-amd64.tgz

Resolving github.com (github.com)... 4.237.22.38

Connecting to github.com (github.com)|4.237.22.38|:443... connected.

HTTP request sent, awaiting response... 302 Found

Location: https://github.com/ollama/ollama/releases/download/v0.12.3/ollama-linux-amd64.tgz [following]

--2025-09-27 17:45:16-- https://github.com/ollama/ollama/releases/download/v0.12.3/ollama-linux-amd64.tgz

Reusing existing connection to github.com:443.

HTTP request sent, awaiting response... 302 Found

Location: https://release-assets.githubusercontent.com/github-production-release-asset/658928958/9781cd83-4122-48d6-917a-db2859aab1e6?sp=r&sv=2018-11-09&sr=b&spr=https&se=2025-09-27T08%3A46%3A07Z&rscd=attachment%3B+filename%3Dollama-linux-amd64.tgz&rsct=application%2Foctet-stream&skoid=96c2d410-5711-43a1-aedd-ab1947aa7ab0&sktid=398a6654-997b-47e9-b12b-9515b896b4de&skt=2025-09-27T07%3A45%3A16Z&ske=2025-09-27T08%3A46%3A07Z&sks=b&skv=2018-11-09&sig=dPlNWOG1vGk0ewU5GbbKGX0%2FSvQFPpd3Yxukh%2BH7MHo%3D&jwt=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmVsZWFzZS1hc3NldHMuZ2l0aHVidXNlcmNvbnRlbnQuY29tIiwia2V5Ijoia2V5MSIsImV4cCI6MTc1ODk1OTQxNiwibmJmIjoxNzU4OTU5MTE2LCJwYXRoIjoicmVsZWFzZWFzc2V0cHJvZHVjdGlvbi5ibG9iLmNvcmUud2luZG93cy5uZXQifQ.AwJdH_03QRZvURIF7_r_Utnmi8-YtXTLVT5j_eVfnJk&response-content-disposition=attachment%3B%20filename%3Dollama-linux-amd64.tgz&response-content-type=application%2Foctet-stream [following]

--2025-09-27 17:45:16-- https://release-assets.githubusercontent.com/github-production-release-asset/658928958/9781cd83-4122-48d6-917a-db2859aab1e6?sp=r&sv=2018-11-09&sr=b&spr=https&se=2025-09-27T08%3A46%3A07Z&rscd=attachment%3B+filename%3Dollama-linux-amd64.tgz&rsct=application%2Foctet-stream&skoid=96c2d410-5711-43a1-aedd-ab1947aa7ab0&sktid=398a6654-997b-47e9-b12b-9515b896b4de&skt=2025-09-27T07%3A45%3A16Z&ske=2025-09-27T08%3A46%3A07Z&sks=b&skv=2018-11-09&sig=dPlNWOG1vGk0ewU5GbbKGX0%2FSvQFPpd3Yxukh%2BH7MHo%3D&jwt=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmVsZWFzZS1hc3NldHMuZ2l0aHVidXNlcmNvbnRlbnQuY29tIiwia2V5Ijoia2V5MSIsImV4cCI6MTc1ODk1OTQxNiwibmJmIjoxNzU4OTU5MTE2LCJwYXRoIjoicmVsZWFzZWFzc2V0cHJvZHVjdGlvbi5ibG9iLmNvcmUud2luZG93cy5uZXQifQ.AwJdH_03QRZvURIF7_r_Utnmi8-YtXTLVT5j_eVfnJk&response-content-disposition=attachment%3B%20filename%3Dollama-linux-amd64.tgz&response-content-type=application%2Foctet-stream

Resolving release-assets.githubusercontent.com (release-assets.githubusercontent.com)... 185.199.109.133, 185.199.110.133, 185.199.111.133, ...

Connecting to release-assets.githubusercontent.com (release-assets.githubusercontent.com)|185.199.109.133|:443... connected.

HTTP request sent, awaiting response... 200 OK

Length: 1908837793 (1.8G) [application/octet-stream]

Saving to: ‘ollama-linux-amd64.tgz’

ollama-linux-amd64. 98%[==================> ] 1.75G 5.74MB/s in 5m 10s

2025-09-27 17:50:27 (5.77 MB/s) - Connection closed at byte 1876951040. Retrying.

--2025-09-27 17:50:28-- (try: 2) https://release-assets.githubusercontent.com/github-production-release-asset/658928958/9781cd83-4122-48d6-917a-db2859aab1e6?sp=r&sv=2018-11-09&sr=b&spr=https&se=2025-09-27T08%3A46%3A07Z&rscd=attachment%3B+filename%3Dollama-linux-amd64.tgz&rsct=application%2Foctet-stream&skoid=96c2d410-5711-43a1-aedd-ab1947aa7ab0&sktid=398a6654-997b-47e9-b12b-9515b896b4de&skt=2025-09-27T07%3A45%3A16Z&ske=2025-09-27T08%3A46%3A07Z&sks=b&skv=2018-11-09&sig=dPlNWOG1vGk0ewU5GbbKGX0%2FSvQFPpd3Yxukh%2BH7MHo%3D&jwt=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmVsZWFzZS1hc3NldHMuZ2l0aHVidXNlcmNvbnRlbnQuY29tIiwia2V5Ijoia2V5MSIsImV4cCI6MTc1ODk1OTQxNiwibmJmIjoxNzU4OTU5MTE2LCJwYXRoIjoicmVsZWFzZWFzc2V0cHJvZHVjdGlvbi5ibG9iLmNvcmUud2luZG93cy5uZXQifQ.AwJdH_03QRZvURIF7_r_Utnmi8-YtXTLVT5j_eVfnJk&response-content-disposition=attachment%3B%20filename%3Dollama-linux-amd64.tgz&response-content-type=application%2Foctet-stream
Connecting to release-assets.githubusercontent.com (release-assets.githubusercontent.com)|185.199.109.133|:443... connected.

HTTP request sent, awaiting response... 618 jwt:expired

2025-09-27 17:50:28 ERROR 618: jwt:expired.


@eroyee commented on GitHub (Sep 27, 2025):

.. still getting a download error. .. always fails at exactly 98.3%.
ERROR 618: jwt:expired.

Wget should just resume from the 98.3% (so long as you have sufficient disk space), so what happens if you just run the script again, without deleting the partially downloaded file?

Not wanting to get into this side of it too deeply, but the overall issue may be a result of changes to the expiration time of the requisite JWT (JSON Web Token), as per the error message you're receiving; thus it could be worth bringing this to the attention of those administering the Ollama site. Otherwise, some useful discussion of how this (doesn't!) work, along with ways to deal with the problem, may be found here: https://github.com/orgs/community/discussions/169381
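The token lifetime is in fact visible in the log above: the jwt query parameter is a standard JWT, and its payload (the middle base64url segment) can be decoded with nothing but base64. A small sketch, using the payload copied from the failing URL in the log:

```shell
# Payload segment of the jwt parameter from the wget log above.
payload='eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmVsZWFzZS1hc3NldHMuZ2l0aHVidXNlcmNvbnRlbnQuY29tIiwia2V5Ijoia2V5MSIsImV4cCI6MTc1ODk1OTQxNiwibmJmIjoxNzU4OTU5MTE2LCJwYXRoIjoicmVsZWFzZWFzc2V0cHJvZHVjdGlvbi5ibG9iLmNvcmUud2luZG93cy5uZXQifQ'

# JWT payloads are base64url without padding: restore the padding and
# map the URL-safe alphabet back to standard base64 before decoding.
case $(( ${#payload} % 4 )) in
  2) payload="${payload}==" ;;
  3) payload="${payload}=" ;;
esac
decoded=$(printf '%s' "$payload" | tr '_-' '/+' | base64 -d)
echo "$decoded"

# Pull out the not-before and expiry claims to get the URL's lifetime.
exp=$(printf '%s' "$decoded" | grep -o '"exp":[0-9]*' | cut -d: -f2)
nbf=$(printf '%s' "$decoded" | grep -o '"nbf":[0-9]*' | cut -d: -f2)
echo "signed URL valid for $((exp - nbf)) seconds"
```

For this particular URL that works out to 300 seconds, which lines up with the log: the download ran for 5m 10s before wget's retry attempt was refused with jwt:expired.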


@steamed-p0tato commented on GitHub (Sep 28, 2025):

Having the same issue from both 5G and Wi-Fi.
wget is working for now, but this issue is extremely weird.


@bitranox commented on GitHub (Oct 1, 2025):

The install.sh does not handle flaky connections well, because curl is piped directly into tar, so it cannot retry.

Here is a corrected version of install.sh. Several pull requests for this same issue have been open since Aug 6, 2025, but unfortunately none have been merged.

The attached script should handle it well.

install_ollama_with_retry.sh: https://github.com/user-attachments/files/22641330/install_ollama_with_retry.sh
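For reference, and without reproducing the attached script, the download-then-extract pattern it presumably implements can be sketched as follows (fetch is a hypothetical helper name, not something from install.sh):

```shell
# Download to a file first, with resume (-C -) and a bounded retry
# loop; extract only once the whole archive is on disk, so a dropped
# connection never reaches tar.
fetch() {
  url=$1; out=$2
  for attempt in 1 2 3 4 5; do
    # -C - resumes from whatever is already in $out
    curl -fsSL -C - -o "$out" "$url" && return 0
    echo "attempt $attempt failed, retrying..." >&2
    sleep 5
  done
  return 1
}

# Real usage (network and root required):
#   fetch https://ollama.com/download/ollama-linux-amd64.tgz /tmp/ollama.tgz
#   sudo tar -xzf /tmp/ollama.tgz -C /usr/local
```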


Reference: github-starred/ollama#33530