[GH-ISSUE #5844] Connection refused on registry.ollama.ai #3645

Closed
opened 2026-04-12 14:25:44 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @jcpraud on GitHub (Jul 22, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5844

What is the issue?

Hi,

Since this morning I get a Connection refused error when trying to pull models:
ollama pull nomic-embed-text:137m-v1.5-fp16
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/nomic-embed-text/manifests/137m-v1.5-fp16": dial tcp 172.67.182.229:443: connect: connection refused

I'm behind a proxy that worked fine last week (the http_proxy and https_proxy environment variables are set correctly).

With my browser (behind the same proxy) I get this:
https://registry.ollama.ai/v2/library/nomic-embed-text/manifests/137m-v1.5-fp16

{"schemaVersion":2,"mediaType":"application/vnd.docker.distribution.manifest.v2+json","config":{"digest":"sha256:31df23ea7daa448f9ccdbbcecce6c14689c8552222b80defd3830707c0139d4f","mediaType":"application/vnd.docker.container.image.v1+json","size":420},"layers":[{"digest":"sha256:970aa74c0a90ef7482477cf803618e776e173c007bf957f635f1015bfcfef0e6","mediaType":"application/vnd.ollama.image.model","size":274290656},{"digest":"sha256:c71d239df91726fc519c6eb72d318ec65820627232b2f796219e87dcf35d0ab4","mediaType":"application/vnd.ollama.image.license","size":11357},{"digest":"sha256:ce4a164fc04605703b485251fe9f1a181688ba0eb6badb80cc6335c0de17ca0d","mediaType":"application/vnd.ollama.image.params","size":17}]}
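As a sanity check, the layer sizes in the manifest above can be summed to verify the expected download — a small illustrative script, not part of Ollama (the manifest below is abbreviated to the fields used):

```python
import json

# Manifest JSON as returned by the registry (abbreviated to the fields used here).
manifest = json.loads("""
{"schemaVersion": 2,
 "layers": [
   {"mediaType": "application/vnd.ollama.image.model",   "size": 274290656},
   {"mediaType": "application/vnd.ollama.image.license", "size": 11357},
   {"mediaType": "application/vnd.ollama.image.params",  "size": 17}]}
""")

# Total bytes a pull of this tag would transfer = sum of all layer sizes.
total = sum(layer["size"] for layer in manifest["layers"])
print(total)  # 274302030 bytes, roughly 262 MiB
```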

I updated Ollama to 0.2.7 this morning; the update script itself ran without errors.

Did I miss something?

Thanks

JC

OS

Linux

GPU

No response

CPU

Intel

Ollama version

0.2.7

GiteaMirror added the bug label 2026-04-12 14:25:45 -05:00
Author
Owner

@jcpraud commented on GitHub (Jul 23, 2024):

Hi,

I upgraded to Ollama 0.2.8, but the error did not change.

What is weird is that with curl from the same machine, it seems to work:

curl "https://registry.ollama.ai/v2/library/nomic-embed-text/manifests/137m-v1.5-fp16"

{"schemaVersion":2,"mediaType":"application/vnd.docker.distribution.manifest.v2+json","config":{"digest":"sha256:31df23ea7daa448f9ccdbbcecce6c14689c8552222b80defd3830707c0139d4f","mediaType":"application/vnd.docker.container.image.v1+json","size":420},"layers":[{"digest":"sha256:970aa74c0a90ef7482477cf803618e776e173c007bf957f635f1015bfcfef0e6","mediaType":"application/vnd.ollama.image.model","size":274290656},{"digest":"sha256:c71d239df91726fc519c6eb72d318ec65820627232b2f796219e87dcf35d0ab4","mediaType":"application/vnd.ollama.image.license","size":11357},{"digest":"sha256:ce4a164fc04605703b485251fe9f1a181688ba0eb6badb80cc6335c0de17ca0d","mediaType":"application/vnd.ollama.image.params","size":17}]}
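Since curl succeeds while ollama fails, one hypothesis is that the shell's proxy variables are not visible to the systemd-managed ollama service. A quick way to compare the two environments (the `ollama` unit name is an assumption; adjust it if your service is named differently):

```shell
# Proxy variables as seen by the interactive shell (where curl runs):
echo "shell https_proxy: ${https_proxy:-<unset>}"

# Proxy variables as seen by the ollama systemd unit (where the server,
# and therefore 'ollama pull', actually runs); prints an empty
# Environment= line if none are set for the unit.
systemctl show ollama --property=Environment 2>/dev/null \
  || echo "ollama unit not found on this machine"
```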

Author
Owner

@jcpraud commented on GitHub (Jul 23, 2024):

OK, I finally found out: for a reason I don't understand, I had to add the http_proxy/https_proxy environment variables to the /etc/systemd/system/ollama.service file. Last week it worked without this.
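For reference, the fix described above corresponds to Environment= lines in the service unit. A minimal sketch, with placeholder proxy values (a drop-in created via `sudo systemctl edit ollama` is the cleaner equivalent to editing the unit file directly):

```ini
# /etc/systemd/system/ollama.service (excerpt) -- proxy URLs are placeholders
[Service]
Environment="http_proxy=http://proxy.example.com:3128"
Environment="https_proxy=http://proxy.example.com:3128"
Environment="no_proxy=localhost,127.0.0.1"
```

After editing, reload the unit and restart the service: `sudo systemctl daemon-reload && sudo systemctl restart ollama`.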


Reference: github-starred/ollama#3645