[GH-ISSUE #7495] mac Errors when running #4768

Closed
opened 2026-04-12 15:42:30 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @shan23chen on GitHub (Nov 4, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7495

What is the issue?

```
ollama run gemma2:2b
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/gemma2/manifests/2b": write tcp [2601:19b:0:b8a0:915f:c8c:3de4:9c5]:50022->[2606:4700:3034::ac43:b6e5]:443: write: socket is not connected
```

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

ollama version is 0.3.14

GiteaMirror added the needs more info, bug labels 2026-04-12 15:42:30 -05:00

@rick-github commented on GitHub (Nov 4, 2024):

What does the following command return:

```
curl -D - https://registry.ollama.ai/v2/library/gemma2/manifests/2b
```

@shan23chen commented on GitHub (Nov 5, 2024):

Thanks for helping!

```
HTTP/2 200
date: Tue, 05 Nov 2024 15:48:29 GMT
content-type: text/plain; charset=utf-8
content-length: 857
via: 1.1 google
alt-svc: h3=":443"; ma=86400
cf-cache-status: DYNAMIC
report-to: {"endpoints":[{"url":"https://a.nel.cloudflare.com/report/v4?s=G1eSPycfrs2M1h2X2E%2FUvuRnE69QCWeSy%2B56YL9pp%2FNMPfQVMzICFcTItZwGtjEjrKGKuejatBqQC8EGPwVtxrhMw%2FvVfPwn0boLLsMq%2B7clHteK%2FT%2BPdwkYbnEa2PgW%2B4h5iS0%3D"}],"group":"cf-nel","max_age":604800}
nel: {"success_fraction":0,"report_to":"cf-nel","max_age":604800}
server: cloudflare
cf-ray: 8dddfd008b1b8c7d-EWR
server-timing: cfL4;desc="?proto=TCP&rtt=7015&sent=7&recv=11&lost=0&retrans=0&sent_bytes=3381&recv_bytes=801&delivery_rate=577052&cwnd=253&unsent_bytes=0&cid=4c3a34bcc87723c1&ts=293&x=0"

{"schemaVersion":2,"mediaType":"application/vnd.docker.distribution.manifest.v2+json","config":{"digest":"sha256:e18ad7af7efbfaecd8525e356861b84c240ece3a3effeb79d2aa7c0f258f71bd","mediaType":"application/vnd.docker.container.image.v1+json","size":487},"layers":[{"digest":"sha256:7462734796d67c40ecec2ca98eddf970e171dbb6b370e43fd633ee75b69abe1b","mediaType":"application/vnd.ollama.image.model","size":1629509152},{"digest":"sha256:e0a42594d802e5d31cdc786deb4823edb8adff66094d49de8fffe976d753e348","mediaType":"application/vnd.ollama.image.template","size":358},{"digest":"sha256:097a36493f718248845233af1d3fefe7a303f864fae13bc31a3a9704229378ca","mediaType":"application/vnd.ollama.image.license","size":8433},{"digest":"sha256:2490e7468436707d5156d7959cf3c6341cc46ee323084cfa3fcf30fe76e397dc","mediaType":"application/vnd.ollama.image.params","size":65}]}
```


@rick-github commented on GitHub (Nov 5, 2024):

So the manifest fetch works from the command line. Does `ollama run gemma2:2b` still fail? If it does, can you include [server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) in your response?
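For reference, on macOS the troubleshooting guide points at a log file under the home directory; a quick way to view it (assuming a default install, with a fallback if the file isn't there yet):

```shell
# View the Ollama server log on macOS (default location per the docs).
cat ~/.ollama/logs/server.log 2>/dev/null || echo "log not found"
```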


@rick-github commented on GitHub (Nov 5, 2024):

Some other commands to try to get some more debugging info:

```
curl --no-proxy "*" -D - https://registry.ollama.ai/v2/library/gemma2/manifests/2b
curl -4 -D - https://registry.ollama.ai/v2/library/gemma2/manifests/2b
curl -6 -D - https://registry.ollama.ai/v2/library/gemma2/manifests/2b
```
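Each variant isolates a different failure mode: bypassing any proxy (`--noproxy`, the correct spelling of the flag), forcing IPv4, and forcing IPv6. A compact sketch that prints just the HTTP status per variant (the `-o /dev/null -w` formatting here is added for brevity, not part of the original suggestion):

```shell
# Probe the registry three ways to isolate the failure:
# proxy bypass, IPv4-only, IPv6-only. Prints one status line each.
URL=https://registry.ollama.ai/v2/library/gemma2/manifests/2b
curl --noproxy '*' -sS -o /dev/null -w "no-proxy: %{http_code}\n" "$URL" || true
curl -4 -sS -o /dev/null -w "ipv4:     %{http_code}\n" "$URL" || true
curl -6 -sS -o /dev/null -w "ipv6:     %{http_code}\n" "$URL" || true
```

If the `-6` variant alone fails while `-4` succeeds, that points at a broken IPv6 path (as the original error's IPv6 addresses suggest) rather than a registry problem.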

@varenc commented on GitHub (Nov 21, 2024):

This seems to be related to a problem I was also experiencing: https://github.com/ollama/ollama/issues/4976#issuecomment-2492515648 (some server logs included)

I found that if I ran the ollama server from `ollama serve` instead of the top bar GUI app, it worked fine.

@rick-github all those curl commands succeed for me. Though I think you meant `curl --noproxy` and not `curl --no-proxy`
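Since the GUI app and a shell-launched `ollama serve` can inherit different proxy configurations, comparing what each sees is a quick sanity check. A minimal sketch (assuming macOS, where `scutil --proxy` reports the system-wide settings the GUI app may pick up):

```shell
# Compare proxy-related environment variables visible to this shell with
# the macOS system proxy settings (which the GUI app may use instead).
env | grep -i proxy || echo "shell: no proxy variables set"
scutil --proxy 2>/dev/null || echo "scutil unavailable (non-macOS?)"
```

A mismatch between the two (e.g. a system proxy set but no shell variables) would explain why the two launch paths behave differently.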

Reference: github-starred/ollama#4768