[GH-ISSUE #1859] Pull model manifest connect timed out #63098

Closed
opened 2026-05-03 11:59:12 -05:00 by GiteaMirror · 24 comments

Originally created by @shivrajjadhav733 on GitHub (Jan 8, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/1859

OS: macOS (Apple M1 Pro chip)

I installed Ollama on the machine. The installation was successful, and I can see the Ollama icon in the menu bar at the top.

When I try to run a model using the command

`ollama run llama2`
or
`ollama run mistral`

I get the attached "operation timed out" error.

![01037D88-D7A1-42C5-8702-7EAF41621293](https://github.com/jmorganca/ollama/assets/35407279/d53d10f4-6d1a-451e-a851-7ca3887b1939)

I tried to run `brew services restart ollama` and got the error "Error: Formula 'ollama' is not installed."

How do I fix these errors and run models using Ollama?


@pdevine commented on GitHub (Jan 8, 2024):

@shivrajjadhav733 are you behind some kind of firewall? Can you `ping registry.ollama.ai`? It looks like DNS resolved correctly.
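If ICMP is blocked by a firewall, an HTTP-level check can stand in for ping (a minimal sketch):

```bash
# Checks DNS, TCP and TLS reachability of the registry without relying on ICMP ping.
curl -vI https://registry.ollama.ai
```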


@shivrajjadhav733 commented on GitHub (Jan 8, 2024):

I am behind a firewall that doesn't route ICMP to the internet, so ping won't work. However, I tried `wget registry.ollama.ai` and it worked.

However, `wget` for the manifest doesn't work.
Please see the screenshot.

![D1B4E4F0-56D6-459F-8438-A50F1E9AD8B7](https://github.com/jmorganca/ollama/assets/35407279/2f17818d-37f0-4d98-8330-8be855b0cd33)


@pdevine commented on GitHub (Jan 8, 2024):

The `bad request` happens because you're not setting the headers the registry needs to understand the request. That's expected behaviour.

To get this to work behind a proxy, you can run `HTTPS_PROXY=<my proxy> ollama serve` when starting Ollama (you should quit the menu-bar icon at the top and start it manually yourself). You'll also need to make sure the proxy's certs are installed correctly on your system.

There's some more info in the FAQ: https://github.com/jmorganca/ollama/blob/main/docs/faq.md#how-do-i-use-ollama-behind-a-proxy
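For example, the sequence might look like this (a minimal sketch; `http://proxy.example.com:3128` is a placeholder for your own proxy host and port):

```bash
# Quit the menu-bar app first, then start the server manually with the proxy set.
HTTPS_PROXY=http://proxy.example.com:3128 ollama serve

# In a second terminal, the pull should now go out through the proxy.
ollama run llama2
```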


@shivrajjadhav733 commented on GitHub (Jan 9, 2024):

1. I went to the menu bar and clicked "Quit Ollama".
2. Please see the screenshot of `ollama serve` before and after step 1 was executed.

   ![44CFBEEB-DA88-433F-B922-3884C9A006C6](https://github.com/jmorganca/ollama/assets/35407279/7f9de084-e838-4af4-8122-ea5c94cf9821)

3. Then I ran the command `HTTPS_PROXY=<my proxy> ollama serve`.
4. Then I went to Applications and ran Ollama manually.
5. Please see screenshot 2, which shows before and after of step 4.

   ![84EF6BC1-C187-4543-BCD6-AEB96F34AD55](https://github.com/jmorganca/ollama/assets/35407279/4f27b4e2-52d4-40e2-94ad-f436bc7354be)

Even after this I still see the same error as explained earlier: network is unreachable.


@pdevine commented on GitHub (Jan 9, 2024):

@shivrajjadhav733 it looks like you're using an `http` proxy and not an `https` proxy with the `HTTPS_PROXY` env variable.


@jingyibo123 commented on GitHub (Jan 9, 2024):

Having the same issue pulling in an environment similar to @shivrajjadhav733's.
Normally (from previous experience) it's due to a self-signed SSL certificate, but Ollama only gives `connection timed out`, so I can't tell exactly whether it's that or the request being blocked by the firewall.


@shivrajjadhav733 commented on GitHub (Jan 9, 2024):

@pdevine yes, the HTTPS_PROXY env variable points to the correct location.

I even tried running the command with the proxy passed explicitly, like this:

![42B78421-E07A-4153-9986-C999888951B9](https://github.com/jmorganca/ollama/assets/35407279/71a27ea6-aa20-4838-9a4a-1ba095f6b96a)

and I still see the connection timeout error.

My suspicion is that `ollama run` is not able to read the environment variable and use the proxy to connect to the internet for the manifest pull.
It seems like a bug in Ollama.


@byjrack commented on GitHub (Jan 9, 2024):

Gut says that https://github.com/jmorganca/ollama/blob/main/server/download.go doesn't respect the proxy, but I'm still checking. So the client might be fine, but having the server pull a model from the registry doesn't quite function.


@butterl commented on GitHub (Jan 10, 2024):

Also have this issue on Ubuntu.


@shivrajjadhav733 commented on GitHub (Jan 12, 2024):

@pdevine any thoughts or suggestions on how to proceed with the fix?


@Peng-Lei commented on GitHub (Jan 30, 2024):

Ubuntu:
Following the steps below reproduces the same error:

1. Log in to Ubuntu as user xxx (a sudoer).
2. Set `http_proxy` and `https_proxy` in `~/.bashrc` (not globally).
3. `systemctl restart ollama`
4. `ollama pull llama2:70b` or `ollama pull llama2:70b --insecure`

It fails:

```
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2/manifests/70b": dial tcp 34.120.132.20:443: connect: connection timed out
```

but `wget registry.ollama.ai` succeeds.

My solution:

1. Log in to Ubuntu as user xxx (a sudoer).
2. Set `http_proxy` and `https_proxy` in `~/.bashrc` (not globally).
3. **`ollama serve` (without sudo)**
4. `ollama pull llama2:70b`

It runs well.


@Peng-Lei commented on GitHub (Jan 30, 2024):

If Ollama is run as a systemd service, it is started by the user `ollama` by default, so we should ensure that the proxy is effective for all users.
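One way to see the mismatch (a sketch, assuming a standard systemd install where the service is named `ollama`):

```bash
# Proxy variables set in your own shell via ~/.bashrc:
env | grep -i proxy

# Environment the ollama systemd service actually sees; shell proxy
# variables will not appear here unless they are set in the unit file.
sudo systemctl show ollama --property=Environment
```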


@Dorish commented on GitHub (Mar 6, 2024):

Met this issue as well on Ubuntu. Any updates?


@pdevine commented on GitHub (Mar 11, 2024):

This is actually _somewhat_ covered in the [FAQ](https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux). The short answer, if you want to configure this only for Ollama, is:

```
$ sudo systemctl edit ollama
```

You'll need to add something like:

```
[Service]
Environment="https_proxy=x.x.x.x"
Environment="http_proxy=y.y.y.y"
```

Alternatively, if you want to set this system-wide for your host, try [these instructions](https://unix.stackexchange.com/questions/213737/how-do-i-set-systemwide-connection-over-a-proxy-server) (from Stack Exchange).

I'm going to go ahead and close the issue, but you can keep commenting, or feel free to reopen it if you feel I've missed something.
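Putting that together, the full sequence might look like this (a sketch; `proxy.example.com:3128` is a placeholder for your own proxy host and port):

```bash
# Opens an override file such as /etc/systemd/system/ollama.service.d/override.conf
sudo systemctl edit ollama

# Add the following to the override (placeholder proxy URLs, substitute your own):
#   [Service]
#   Environment="https_proxy=http://proxy.example.com:3128"
#   Environment="http_proxy=http://proxy.example.com:3128"

# Reload systemd and restart the service so the new environment takes effect.
sudo systemctl daemon-reload
sudo systemctl restart ollama
```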


@dhandhalyabhavik commented on GitHub (Apr 10, 2024):

> This is actually _somewhat_ covered in the [FAQ](https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux). The short answer, if you want to configure this only for Ollama, is:
>
> ```
> $ sudo systemctl edit ollama
> ```
>
> You'll need to add something like:
>
> ```
> [Service]
> Environment="https_proxy=x.x.x.x"
> Environment="http_proxy=y.y.y.y"
> ```
>
> Alternatively, if you want to set this system-wide for your host, try [these instructions](https://unix.stackexchange.com/questions/213737/how-do-i-set-systemwide-connection-over-a-proxy-server) (from Stack Exchange).
>
> I'm going to go ahead and close the issue, but you can keep commenting, or feel free to reopen it if you feel I've missed something.

Adding more details here.
You can permanently configure the proxy settings by adding an HTTP proxy config file as follows:

```bash
# create the drop-in directory and file
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo touch /etc/systemd/system/ollama.service.d/http-proxy.conf
```

Then add the proxy settings to that file:

```
[Service]
Environment="https_proxy=x.x.x.x"
Environment="http_proxy=y.y.y.y"
```

After adding those lines, just restart the required services:

```bash
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

@SleeplessBegonia commented on GitHub (May 27, 2024):

Hello, I am wondering what I should do to solve this problem?

<img width="1015" alt="微信图片_20240527170649" src="https://github.com/ollama/ollama/assets/115441356/4b3667ca-7500-497d-8c63-a930fee99ee5">


@AmarkanthJinna commented on GitHub (May 29, 2024):

Hi, please help me.

```
ollama run llama3

pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama3/manifests/latest": dial tcp 104.21.75.227:443: i/o timed out
```

I'm behind a corporate proxy:

http://proxy.ebiz.corporate.com:80

Is this a proxy issue? How can I add the proxy here on my Linux machine?


@pdevine commented on GitHub (May 30, 2024):

@AmarkanthJinna did you try the instructions mentioned above? It's also possible that you'll need to get your company to open up the proxy.


@FrancoisLasson commented on GitHub (May 31, 2024):

Solution is here: https://github.com/ollama/ollama/issues/729


@xxxpsyduck commented on GitHub (Jul 24, 2024):

The file is at `/etc/systemd/system/ollama.service`. Make sure the file exists and was successfully saved after you edited it.
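A quick way to confirm what systemd actually loaded (a sketch, assuming the service is named `ollama`):

```bash
# Prints the unit file plus any drop-in overrides systemd has loaded for it.
systemctl cat ollama

# Lists drop-in files, if you used the override/drop-in approach instead.
ls /etc/systemd/system/ollama.service.d/
```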


@ssghost commented on GitHub (Jul 24, 2024):

I have the same problem on macOS with version 0.2.5, and setting the HTTPS_PROXY variable doesn't help. My solution was to downgrade from 0.2.5 to 0.2.3.


@AmarkanthReddyJinna commented on GitHub (Jul 24, 2024):

Yes, I have added the proxy details in `/etc/systemd/system/ollama.service`, but I still get a "Forbidden" error when pulling the manifest.
The command I tried was: `ollama run mistral`


@byjrack commented on GitHub (Jul 24, 2024):

So an HTTP 403 tells me that you are getting a proxy connection, BUT the proxy is disallowing you from reaching the destination.

I would try using curl through the same proxy to https://registry.ollama.ai or https://r2.cloudflarestorage.com to see if your security team may have restricted either Ollama as an AI product, or the R2 service where the registry is hosted as a file-storage DLP risk.
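For example (a sketch; `http://proxy.example.com:80` is a placeholder for your corporate proxy):

```bash
# Send a HEAD request to the registry through the proxy. Any response coming back
# from registry.ollama.ai means the proxy let the request through; a 403 generated
# by the proxy itself means it is blocking the destination.
curl -x http://proxy.example.com:80 -I https://registry.ollama.ai

# Same check for the blob storage host.
curl -x http://proxy.example.com:80 -I https://r2.cloudflarestorage.com
```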


@xxxpsyduck commented on GitHub (Jul 25, 2024):

> Yes, I have added the proxy details in `/etc/systemd/system/ollama.service`, but I still get a "Forbidden" error when pulling the manifest. The command I tried was: `ollama run mistral`

Did you run the following commands after editing the file?

```
systemctl daemon-reload
systemctl restart ollama
```
Reference: github-starred/ollama#63098