[GH-ISSUE #12073] Destination unreachable: Source address failed ingress/egress policy #54533

Closed
opened 2026-04-29 06:16:18 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @DiogoHSS on GitHub (Aug 25, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12073

What is the issue?

This message is what I get when I try to `ping registry.ollama.ai`.

It also causes the commands `ollama list` and `ollama show` to hang for 30 seconds and then fail with: Error: Head "http://127.0.0.1:11434/": dial tcp 127.0.0.1:11434: i/o timeout

I installed it using the command from the website: `curl -fsSL https://ollama.com/install.sh | sh`.

System:

Void Linux running on WSL2 on Windows 11

Relevant log output


OS

WSL2

GPU

AMD

CPU

AMD

Ollama version

client 0.11.6

GiteaMirror added the question label 2026-04-29 06:16:18 -05:00
Author
Owner

@rick-github commented on GitHub (Aug 25, 2025):

```console
$ ping -c 3 registry.ollama.ai
PING registry.ollama.ai (172.67.182.229) 56(84) bytes of data.
64 bytes from 172.67.182.229: icmp_seq=1 ttl=57 time=15.3 ms
64 bytes from 172.67.182.229: icmp_seq=2 ttl=57 time=16.3 ms
64 bytes from 172.67.182.229: icmp_seq=3 ttl=57 time=14.2 ms

--- registry.ollama.ai ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 2003ms
rtt min/avg/max/mdev = 14.243/15.296/16.315/0.846 ms
```

Do you have a firewall in Void or Windows that prevents outbound connections?

Author
Owner

@pdevine commented on GitHub (Aug 25, 2025):

Maybe an issue w/ WSL2? Can you ping from Windows?

This issue isn't really related to ollama, so I'm going to go ahead and close it (thank you @rick-github !). You can feel free to keep commenting though.

Author
Owner

@DiogoHSS commented on GitHub (Aug 25, 2025):

Thank you both for your help. I only saw the messages now that I got home, and it was indeed a problem with my WSL2 configuration. Here is what I did:

I could indeed ping from Windows.
I tried to ping and curl google.com from WSL2, and even that did not work.

It turns out WSL was running with networkingMode = mirrored; I switched it to nat, restarted, and Ollama is running correctly now.
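For reference, WSL2's networking mode is configured in `%UserProfile%\.wslconfig` on the Windows side. A minimal sketch of the relevant setting, plus a quick way to see which mode a given config file selects (the file path and contents here are illustrative, not the reporter's actual config):

```shell
# Illustrative .wslconfig contents. The real file lives at
# %UserProfile%\.wslconfig on Windows, not inside the WSL filesystem.
cat > /tmp/sample.wslconfig <<'EOF'
[wsl2]
# "mirrored" shares the Windows host's network interfaces with the VM;
# "nat" (the default) gives the VM its own virtual NIC behind NAT.
networkingMode=nat
EOF

# Report the configured mode; WSL falls back to "nat" when the key is absent.
mode=$(sed -n 's/^networkingMode=//p' /tmp/sample.wslconfig)
echo "networkingMode: ${mode:-nat}"
```

After editing the file, run `wsl --shutdown` from Windows and restart the distro for the change to take effect.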


Reference: github-starred/ollama#54533