[GH-ISSUE #2402] Error dial tcp: lookup no such host #1398

Closed
opened 2026-04-12 11:13:39 -05:00 by GiteaMirror · 5 comments

Originally created by @casey-martin on GitHub (Feb 8, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2402

I am encountering a `dial tcp: lookup` error when executing any `ollama pull` or `ollama run` command through Docker on Ubuntu 22.04. I searched through the issues and found some similar errors; however, those were related to the users' proxies, which I am not using. I am also not running any firewalls. The commands I executed are as follows:

```bash
$ sudo docker pull ollama/ollama
Using default tag: latest
latest: Pulling from ollama/ollama
Digest: sha256:36ce80dc7609fe79711d261f6614a611f7ce200dcd2849367e49812fd4181e67
Status: Image is up to date for ollama/ollama:latest
docker.io/ollama/ollama:latest

$ sudo docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

$ sudo docker ps -a
CONTAINER ID   IMAGE           COMMAND               CREATED             STATUS             PORTS                                           NAMES
687b609d95bf   ollama/ollama   "/bin/ollama serve"   About an hour ago   Up About an hour   0.0.0.0:11434->11434/tcp, :::11434->11434/tcp   ollama

$ sudo docker exec -it ollama ollama run llama2
Error: Head "https://registry.ollama.ai/v2/library/llama2/blobs/sha256:8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246": dial tcp: lookup registry.ollama.ai on 192.168.0.1:53: no such host
```

Do you have any suggestions for resolving this error?
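One way to narrow this down (a sketch, assuming the host itself can resolve the name) is to compare DNS resolution on the host and inside the container, and to pin public resolvers for the container if only the container-side lookup fails. The `--dns` flag and the 8.8.8.8/1.1.1.1 resolver addresses below are illustrative choices, not the fix this thread eventually settled on:

```bash
# Check whether the host can resolve the registry (run on the host).
getent hosts registry.ollama.ai || echo "host cannot resolve registry.ollama.ai"

# Check from inside the container; if this fails while the host check
# succeeds, the container is not inheriting a working resolver.
docker exec -it ollama sh -c 'getent hosts registry.ollama.ai'

# Recreate the container with explicit public resolvers (illustrative;
# any reachable resolver works).
docker rm -f ollama
docker run -d --gpus=all --dns 8.8.8.8 --dns 1.1.1.1 \
  -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```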


@c10l commented on GitHub (Feb 20, 2024):

I was just having that. Not sure what the actual problem was but restarting the Ollama service helped.


@casey-martin commented on GitHub (Feb 20, 2024):

I found out it was due to my ISP. I have atrocious internet speeds, and I suspect the server which hosts the model weights will terminate the connection if there are latency/bandwidth issues with the client.

For me, if I ran `ollama pull model` over and over again, eventually a temporary connection could be made with the server to download the model weights. That said, the spotty connection would still cause the server to drop the connection mid-download, but once the manifest was pulled, it was able to pick up where the download left off.

I will go ahead and close the issue as I found that the issue is on my (ISP's) end. The joys of functional monopolies.
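That retry-until-it-sticks approach can be scripted. This is just a sketch of the loop (the model name and retry delay are placeholders), relying on the resume behavior described above, where `ollama pull` picks up from already-downloaded layers once the manifest has been fetched:

```bash
# Keep retrying the pull until it succeeds; interrupted runs resume
# from already-downloaded layers instead of starting over.
until ollama pull llama2; do
  echo "pull interrupted, retrying in 5s..." >&2
  sleep 5
done
```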


@Viral-Gajera commented on GitHub (May 2, 2024):

I was having the same issue while pulling the model, but turning on Cloudflare (WARP) worked for me.

![image](https://github.com/ollama/ollama/assets/85317631/6f5004d2-33b4-494d-b3ca-6d6aab81f59e)

![image](https://github.com/ollama/ollama/assets/85317631/e6e956af-e51c-4bdb-92f3-8225ca7d4039)


@Adarshsingh7 commented on GitHub (May 22, 2024):

@Viral-Gajera what does "turning on Cloudflare" mean, and how do you do it?


@SrivatsaSJoshi commented on GitHub (May 31, 2024):

@Adarshsingh7 go to 1.1.1.1 and install Cloudflare WARP for whichever platform you use.
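A lighter-weight check than installing WARP (purely illustrative) is to query Cloudflare's public resolver directly for the registry host; if this succeeds while the default resolver's lookup fails, the problem is the upstream DNS rather than Ollama itself:

```bash
# Ask Cloudflare's 1.1.1.1 resolver directly for the registry host.
nslookup registry.ollama.ai 1.1.1.1
```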

Reference: github-starred/ollama#1398