[GH-ISSUE #2448] Linux(WSL Ubuntu) installation curl command fails #1429

Closed
opened 2026-04-12 11:18:18 -05:00 by GiteaMirror · 17 comments

Originally created by @UeberTimei on GitHub (Feb 11, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2448

Originally assigned to: @dhiltgen on GitHub.

```
curl -fsSL https://ollama.com/install.sh | sh
```

This leads to:

```
curl: (35) OpenSSL SSL_connect: Connection reset by peer in connection to ollama.com:443
```

I tried everything. I reinstalled WSL and set Google DNS.
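As a hedged diagnostic sketch (not part of the original report): curl's exit codes separate the two failure modes that appear throughout this thread, so a small wrapper can tell a DNS problem from a reset TLS handshake. The `check` helper and its messages are illustrative only.

```shell
# Illustrative triage helper: map curl exit codes to the failure modes
# seen in this thread. Exit code 6 = could not resolve host (DNS),
# 35 = SSL/TLS handshake problem (often a proxy or firewall reset).
check() {
  rc=0
  curl -fsS --connect-timeout 10 "$1" -o /dev/null 2>/dev/null || rc=$?
  case "$rc" in
    0)  echo "ok" ;;
    6)  echo "DNS failure: check /etc/resolv.conf inside WSL" ;;
    35) echo "TLS reset: check proxy, firewall, or certificates" ;;
    *)  echo "curl exit code $rc" ;;
  esac
}
check https://ollama.com/install.sh
```

Run inside the WSL terminal; the output depends on your network environment.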

GiteaMirror added the networking, linux labels 2026-04-12 11:18:19 -05:00

@pdevine commented on GitHub (Feb 18, 2024):

I think there was an issue w/ this when we switched from `ollama.ai` to `ollama.com`. Can you try it with:

```
curl -fsSL https://ollama.ai/install.sh | sh
```

@UeberTimei commented on GitHub (Feb 19, 2024):

I've installed ollama on WSL but now I can't install models.

```
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2/manifests/latest": dial tcp: lookup registry.ollama.ai on [::1]:53: read udp [::1]:46273->[::1]:53: read: connection refused
```


@dhiltgen commented on GitHub (Mar 11, 2024):

Are you perhaps behind a firewall or proxy?

Can you try to curl the site from inside a wsl terminal? Something like `curl https://registry.ollama.ai/v2/_catalog` (takes a while, but should produce a json blob showing the first few dozen public models)

Perhaps our Windows native install is an option to consider?


@jianotl commented on GitHub (Mar 16, 2024):

```
root@f09b3869a542:/# ollama run gemma
pulling manifest
Error: pull model manifest: Get "https://ollama.com/token?nonce=7zKHsvSwLKftB_ZFljykiw&scope=repository%!A(MISSING)library%!F(MISSING)gemma%!A(MISSING)pull&service=ollama.com&ts=1710593137": read tcp 172.17.0.6:33406->34.120.132.20:443: read: connection reset by peer
```

I also get it.


@dhiltgen commented on GitHub (Mar 18, 2024):

@jianotl can you try the `curl` test I suggested above? My suspicion is there's a firewall or proxy configuration that's preventing network access to the ollama model registry.


@mxyng commented on GitHub (Mar 18, 2024):

@UeberTimei port 53 indicates there's a problem with DNS. Make sure DNS is set up correctly in this environment.
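For readers hitting the same `[::1]:53` lookup error, a common WSL2 workaround (a sketch based on Microsoft's WSL configuration docs, not something prescribed in this thread) is to stop WSL from auto-generating `/etc/resolv.conf` and pin a resolver. The `PREFIX` variable lets the snippet dry-run into a scratch directory; apply it for real with `PREFIX=""` and root privileges, then run `wsl --shutdown` from Windows and reopen the WSL terminal.

```shell
# Dry-runnable sketch of the WSL2 DNS workaround. With PREFIX unset it
# writes into ./wsl-demo instead of the real /etc.
PREFIX="${PREFIX:-./wsl-demo}"
mkdir -p "$PREFIX/etc"

# 1. Tell WSL to stop regenerating resolv.conf on each start.
printf '[network]\ngenerateResolvConf = false\n' > "$PREFIX/etc/wsl.conf"

# 2. Pin a resolver explicitly (Google DNS here as an example).
printf 'nameserver 8.8.8.8\n' > "$PREFIX/etc/resolv.conf"

cat "$PREFIX/etc/wsl.conf" "$PREFIX/etc/resolv.conf"
```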


@UeberTimei commented on GitHub (Mar 19, 2024):

> Are you perhaps behind a firewall or proxy?
>
> Can you try to curl the site from inside a wsl terminal? Something like `curl https://registry.ollama.ai/v2/_catalog` (takes a while, but should produce a json blob showing the first few dozen public models)
>
> Perhaps our Windows native install is an option to consider?

I'm getting this:

```
curl: (6) Could not resolve host: registry.ollama.ai
```


@mxyng commented on GitHub (Mar 19, 2024):

> curl: (6) Could not resolve host: registry.ollama.ai

This indicates there's a problem with DNS. Please ensure the DNS is set up correctly for the WSL context.


@UeberTimei commented on GitHub (Mar 19, 2024):

> This indicates there's a problem with DNS. Please ensure the DNS is set up correctly for the WSL context

Now I'm getting:

```
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2/manifests/latest": read tcp 172.30.173.84:41144->34.120.132.20:443: read: connection reset by peer
```

I also tried `curl https://registry.ollama.ai/v2/_catalog` and got this:

```
curl: (35) OpenSSL SSL_connect: Connection reset by peer in connection to registry.ollama.ai:443
```


@dhiltgen commented on GitHub (Mar 19, 2024):

@UeberTimei I think this most likely indicates a local network, VPN, or proxy problem. The site is online and accessible.

If you're running on Windows, you might consider the native Windows install of Ollama if you're having difficulty setting up WSL2 networking to work correctly in your environment.


@chopeen commented on GitHub (Mar 22, 2024):

I am also experiencing the same problem:

```
$ ollama run llava
pulling manifest
Error: pull model manifest: Get "https://ollama.com/token?nonce=SW96RgmctcQmHJ37NXJ8KQ&scope=repository%!A(MISSING)library%!F(MISSING)llava%!A(MISSING)pull&service=ollama.com&ts=1711106784": read tcp 10.144.68.189:40600->34.120.132.20:443: read: connection reset by peer
```

Downloading the catalog is working fine:

```
$ curl https://registry.ollama.ai/v2/_catalog
{"repositories":["a123/tiny-gpt","abambah/avnit","abhi3linku/test-model","adeelahmad/aws-zephyr-7b-alpha","adrienbrault/biomistral-7b","adrienbrault/gorilla-openfunctions-v2","adrienbrault/lonestriker-senku-70b","adrienbrault/nous-capybara-3b","adrienbrault/nous-hermes2pro","adrienbrault/qwen1.5-0.5b-openhermes-2.5","adrienbrault/saul-instruct-v1","adrienbrault/tess-70b-v1.6","adrienbrault/test-phi2","adrienbrault/test-q1.5-0.5b","adrienbrault/test-tv1b","adrienbrault/testtd","adrienbrault/wolfram-miquliz-120b-v2","adrienbrault/yi-9b-200k","adrienlcq/shad0w59i","aimadeapproachable/genstruct_7b","airat/karen-the-editor-v2-creative","airat/karen-the-editor-v2-strict","aish/svce-ai","aisherpa/codellama-70b-instruct-hf","aisherpa/mistral-7b-instruct-v02","aisherpa/wizardmath-7b-v1.1","alenbijelic/devops","alhu/zephyr-chat","ancerlop/mistral-params","andrewcanis/command-r","anik/copilot_llama2","ankk98/cyrax7b","antony66/saiga_mistral_7b_128k","anxiu0101/campus-hub-bot","apattana/llama2","apattana/new-model","apattana/test-model","apto/elyza","argilla/notus","ariekdev/oy_bro","arsalananwar/academate","asedmammad/contextual-mistral","asedmammad/lhk-dpo","avinish/finetuned-qunatized","azure99/blossom-test","azure99/blossom-v5","bbrittuw/bge-large-en-v1.5","bbrittuw/ggml-sfr-embedding-mistral-q8_0","bdx0/vietcuna","bengt0/em_german_leo_mistral","bhargav/donkey-ai","bhargav/donkey-initial","bhargav/donkey-soql","bigllama/mistralv01-7b","bitbinge/dazz","bitbinge/stickler","bitbinge/zoar","bjaburg/gen4-mistral","bjaburg/porsche-phi","bob/gemma-test-1","boubou77/emojicool","boubou77/zeus","breitburg/tinyllama","bruceignoretest/bruce-test-1","bruceignoretest/bruce-test-2","bruceignoretest/bruce-test-3","bruceignoretest/bruce-test-4","brunneis/hermes-2-pro-mistral-7b-q8-0","brunneis/hermes-2-pro-mistral-7b","brunneis/mistral-7b-instruct-v0.2-q4-k-m","brxce/gemma","brxce/mario","brxce/mario23","brxce/marioo","brxce/monadgpt","brxce/ne","brxce/stable-diffusion-prompt-generator","brxce/test-fix","brxce/whiterabbitneo-33b","c2p/gemma-ai","cagataycali/neuron","calebfahlgren/natural-functions","captainkyd/trinity-13b","captainkyd/whiterabbitneo7b","carotamunoz/mario","cas/alphamonarch-7b","cas/brezn-7b","cas/brezn3","cas/discolm-german-laser","cas/discolm-mfto-german","cas/german-assistant-v7","cas/hermeo-7b","cas/kafkalm-7b-dare-ties-laserrmt-qlora-dpo-v0.5","cas/kafkalm-7b-german-v0.1","cas/minicpm-3b-hephaestus","cas/minicpm-3b-openhermes-2.5-v2","cas/minicpm-3b-turangga-v3-ep50","cas/mistral-ft-optimized-1227","cas/mistral-instruct-v0.2-2x7b-moe","cas/mixtral_11bx2_moe"]}
```

Port `443` suggests an SSL issue, and I have some company self-signed certificates added to `/etc/ssl/certs/ca-certificates.crt` so that the OS trusts them.

  1. Is it possible that the Ollama application rejects them nonetheless?
  2. Is it possible to run Ollama in verbose mode or check details in any log file?

@dhiltgen commented on GitHub (Mar 24, 2024):

> Is it possible that the Ollama application rejects them (self signed proxy certs) nonetheless?

This sounds like a plausible explanation. That said, it sounds like you updated the expected file for Ubuntu:

https://go.dev/src/crypto/x509/root_linux.go

You might want to try a few other paths from that file and see if one works. If not, it may require code changes to ollama to be able to adjust how we're establishing SSL connections.

> Is it possible to run Ollama in verbose mode or check details in any log file?

We do have `OLLAMA_DEBUG=1` for turning up verbosity, but I don't think that will yield information about TLS trusted root cert lookups.
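As a related sketch (an assumption not stated in the thread itself): on Unix, Go's `crypto/x509` root loading also honors the `SSL_CERT_FILE` and `SSL_CERT_DIR` environment variables, so pointing them at the bundle that contains the corporate certificates before starting the server may be worth trying.

```shell
# Go binaries read these overrides when loading trusted roots
# (see crypto/x509 root loading on Unix); the path below is
# Ubuntu's default CA bundle.
export SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt
export SSL_CERT_DIR=/etc/ssl/certs
echo "SSL_CERT_FILE=$SSL_CERT_FILE"
# Then restart the server in this same environment, e.g.: ollama serve
```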


@ccsyt1205 commented on GitHub (Mar 28, 2024):

@dhiltgen Sorry to disturb you bro, I encountered a problem while executing `ollama pull gemma` behind a proxy. Is there any solution to specify a proxy? Maybe like this: `ollama pull gemma --proxy "http:xxxxxx.com.cn"`?


@chopeen commented on GitHub (Mar 28, 2024):

@ccsyt1205 I think you are looking for `HTTP_PROXY` or `HTTPS_PROXY` - see [faq.md#how-do-i-use-ollama-behind-a-proxy](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-use-ollama-behind-a-proxy) for details.
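For concreteness, here is a sketch of what that FAQ entry describes; the proxy URL is a placeholder, not a real endpoint. The variable is set in the environment the ollama server runs in, and for a systemd-managed install the FAQ's approach is a service override rather than a shell export.

```shell
# Placeholder proxy URL; substitute your corporate proxy.
export HTTPS_PROXY="http://proxy.example.com:8080"
echo "HTTPS_PROXY=$HTTPS_PROXY"

# For the systemd service, the FAQ instead suggests an override, roughly:
#   sudo systemctl edit ollama.service
#     [Service]
#     Environment="HTTPS_PROXY=http://proxy.example.com:8080"
#   sudo systemctl daemon-reload && sudo systemctl restart ollama
```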


@chopeen commented on GitHub (Mar 28, 2024):

@dhiltgen The error happens in this line: https://github.com/ollama/ollama/blob/756c2575535641f1b96d94b4214941b90f4c30c7/server/images.go#L1204-L1207.

I added `slog.Error(fmt.Sprintf("http.DefaultClient.Do(req) failed with error: %s", err))` and got the following output:

```
time=2024-03-28T09:51:41.142+01:00 level=ERROR source=images.go:1206 msg="http.DefaultClient.Do(req) failed with error: Get \"https://ollama.com/token?nonce=wxlLnVBWIaC0XGORVmYxbg&scope=repository%3Alibrary%2Fqwen%3Apull&service=ollama.com&ts=1711615900\": read tcp 10.144.68.189:54912->34.120.132.20:443: read: connection reset by peer"
```

Any suggestions how to log more details to understand the root cause?


I tried to create a minimal reproduction example, but all these snippets run fine on my laptop inside the corporate network:

- https://gist.github.com/chopeen/8b2943db321d25d338e20f6283430a2e
- https://gist.github.com/chopeen/f4e0f4a9b188d520da9d289acaee0e49
- https://gist.github.com/chopeen/a47bb9a55464acd218c6e6949823503f

At the same time, `ollama run ...` fails on the same laptop, inside the same network. It would seem that plain Go programs trust the self-signed certificates added to the OS cert store, while Ollama does not.


@chopeen commented on GitHub (Mar 28, 2024):

I realized that Ollama used to work fine for me in the office, so I **downgraded** Ollama to `v0.1.27` and the problem is **gone**. 🎉

Installing `v0.1.28` (or later) means I cannot pull images at the office. 🙁

@dhiltgen I looked at [v0.1.27...v0.1.28](https://github.com/ollama/ollama/compare/v0.1.27...v0.1.28) and the removal of `format/openssh.go` caught my attention in fd10a2ad4b141ee3117f6a110046f003dbba1b05. There are also changes in `initializeKeypair`. I can see a public and private key pair it creates under `~/.ollama`; what are these keys used for?


@mxyng commented on GitHub (Mar 28, 2024):

#2719 is unrelated. Furthermore, this issue has gotten off topic. The original issue deals with fetching the install script which has been resolved. I'm now going to close this issue.

For further discussion on pull errors, specifically connections being forcefully closed, please see #3112. Other similar issues will be closed as duplicates.


Reference: github-starred/ollama#1429