[GH-ISSUE #1417] Can't pull model manifest #26517

Closed
opened 2026-04-22 02:49:58 -05:00 by GiteaMirror · 9 comments
Owner

Originally created by @bw-Deejee on GitHub (Dec 7, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1417

I just installed ollama on an Azure VM.
Running `ollama run llama2` results in

`pulling manifest ⠴` for a couple of minutes and eventually:

Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2/manifests/latest": dial tcp 34.120.132.20:443: connect: connection timed out

Visiting the link also results in this response:
`{ "errors": [ { "code": "MANIFEST_INVALID", "message": "manifest invalid", "detail": {} } ] }`

I've tried a lot of things seen in other issues, as I'm operating behind a proxy. But nothing seems to work, even though my proxy works for everything else. And the invalid JSON response above leads me to believe that the problem might not be on my end. Please help.
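For anyone debugging this class of failure, one thing worth checking (an assumed diagnostic, not something from the issue itself) is whether the ollama daemon actually sees the proxy variables: a systemd service does not inherit a login shell's exports, so `export HTTPS_PROXY=…` in your terminal has no effect on it.

```shell
# Hypothetical diagnostic (Linux): inspect a process's real environment via /proc.
# Demonstrated against a fresh shell's own PID; for the daemon, substitute the
# ollama PID (e.g. from `pgrep -x ollama`):
HTTPS_PROXY=https://proxy.example.com sh -c \
  'tr "\0" "\n" < "/proc/$$/environ" | grep -c "^HTTPS_PROXY"'
```

If this prints `0` against the ollama PID, the proxy setting never reached the service, and the systemd drop-in discussed below is the fix.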


@BruceMacD commented on GitHub (Dec 7, 2023):

Hi @bw-Deejee, give the steps from the FAQ a try. Pulling from behind a proxy may cause issues.
https://github.com/jmorganca/ollama/blob/main/docs/faq.md#how-do-i-use-ollama-behind-a-proxy


@bw-Deejee commented on GitHub (Dec 8, 2023):

I checked it out, but still can't get it to work.
The recommended line:
echo 'Environment="HTTPS_PROXY=https://proxy.example.com"' >>/etc/systemd/system/ollama.service.d/environment.conf
doesn't work, as there is no "ollama.service.d" directory in the system path yet.

I then created the directory myself and added my proxy address as stated above (yes, I replaced the example with my actual proxy address).

I still get the same timeout.

`systemctl status ollama` returns the following warning:

● ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: enabled)
    Drop-In: /etc/systemd/system/ollama.service.d
             └─environment.conf
     Active: active (running) since Fri 2023-12-08 11:18:05 UTC; 2s ago
   Main PID: 2240 (ollama)
      Tasks: 8 (limit: 67474)
     Memory: 21.6M
     CGroup: /system.slice/ollama.service
             ├─2240 /usr/local/bin/ollama serve
             └─2247 nvidia-smi --query-gpu=memory.free --format=csv,noheader,nounits

Dec 08 11:18:05 xxx systemd[1]: Started Ollama Service.
Dec 08 11:18:05 xxx ollama[2240]: 2023/12/08 11:18:05 images.go:734: total blobs: 0
Dec 08 11:18:05 xxx ollama[2240]: 2023/12/08 11:18:05 images.go:741: total unused blobs removed: 0
Dec 08 11:18:05 xxx ollama[2240]: 2023/12/08 11:18:05 routes.go:787: Listening on 127.0.0.1:11434 (version 0.1.13)
Dec 08 11:18:07 xxx systemd[1]: /etc/systemd/system/ollama.service.d/environment.conf:1: Assignment outside of section. Ignoring.

Also, I want to reiterate my earlier question: is it normal to get this "manifest invalid" JSON response when accessing https://registry.ollama.ai/v2/library/llama2/manifests/latest via browser?
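On the "manifest invalid" question: this is likely expected rather than a sign of a server-side problem. OCI/Docker-style registries generally want an explicit `Accept` header naming a manifest media type, and a plain browser request does not send one. A sketch of the request a registry client would issue, echoed rather than executed so it can be copied to the VM (the media type is an assumption about what this registry accepts, not something confirmed in the thread):

```shell
# Hedged sketch: construct the manifest request with an explicit Accept header.
accept='application/vnd.docker.distribution.manifest.v2+json'
url='https://registry.ollama.ai/v2/library/llama2/manifests/latest'
echo "curl -s -H 'Accept: $accept' '$url'"
```

If that command (run from the VM, through the proxy) returns manifest JSON while the bare browser visit still errors, the registry itself is fine and only the proxy path needs fixing.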


@mxyng commented on GitHub (Dec 11, 2023):

The FAQ might be unclear, but the `Environment` line is preceded by creating the directory and adding a section header. The complete steps should be:

$ mkdir -p /etc/systemd/system/ollama.service.d
$ echo '[Service]' >/etc/systemd/system/ollama.service.d/environment.conf
$ echo 'Environment="HTTPS_PROXY=https://proxy.example.com"' >>/etc/systemd/system/ollama.service.d/environment.conf

The `[Service]` header is missing, so systemd is actually ignoring the environment configuration.
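One step worth adding (standard systemd behavior, not stated in the comment above): the drop-in only takes effect after systemd re-reads its unit files and the service restarts. A sketch of the complete fix, written against a throwaway prefix so it is safe to run anywhere; replace the scratch prefix with `/etc` to apply it for real:

```shell
# Build the drop-in under a scratch prefix; the [Service] header must come
# before the Environment line for systemd to honor it.
prefix="$(mktemp -d)"
mkdir -p "$prefix/systemd/system/ollama.service.d"
conf="$prefix/systemd/system/ollama.service.d/environment.conf"
printf '[Service]\nEnvironment="HTTPS_PROXY=https://proxy.example.com"\n' > "$conf"
head -n 1 "$conf"   # → [Service]
# After editing the real file, tell systemd to re-read units and restart ollama:
#   sudo systemctl daemon-reload && sudo systemctl restart ollama
```

Skipping `daemon-reload` is a common reason the "Assignment outside of section" warning (or the old timeout) persists even after the file looks correct.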


@bw-Deejee commented on GitHub (Dec 12, 2023):

Working now thanks to that. Thank you!


@RockportTigger commented on GitHub (Apr 12, 2024):

> The FAQ might be unclear, but the `Environment` line is preceded by creating the directory and adding a section header. The complete steps should be:
>
> $ mkdir -p /etc/systemd/system/ollama.service.d
> $ echo '[Service]' >/etc/systemd/system/ollama.service.d/environment.conf
> $ echo 'Environment="HTTPS_PROXY=https://proxy.example.com"' >>/etc/systemd/system/ollama.service.d/environment.conf
>
> The `[Service]` header is missing, so systemd is actually ignoring the environment configuration.

Worked! Perfect! Thanks!


@datalee commented on GitHub (Apr 28, 2024):

> The FAQ might be unclear, but the `Environment` line is preceded by creating the directory and adding a section header. The complete steps should be:
>
> $ mkdir -p /etc/systemd/system/ollama.service.d
> $ echo '[Service]' >/etc/systemd/system/ollama.service.d/environment.conf
> $ echo 'Environment="HTTPS_PROXY=https://proxy.example.com"' >>/etc/systemd/system/ollama.service.d/environment.conf
>
> The `[Service]` header is missing, so systemd is actually ignoring the environment configuration.

Based on the above solution, I encountered the following issue:
`Error: pull model manifest: Get https://registry.ollama.ai/v2/library/qwen/manifests/110b-chat-v1.5- q4_0 proxyconnect tcp: dial tcp: lookup http: no such host`
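The `lookup http: no such host` part of that error means the dialer ended up treating the literal string `http` as the proxy hostname, which usually points at a malformed proxy URL (for example a stray space after the scheme, or a scheme pasted where the host should be) rather than a registry problem. That is an inference from the error text, not something confirmed in the thread. A quick shape check on the configured value:

```shell
# Hypothetical check: extract the host from the proxy URL and make sure it is not
# the literal string "http". A value like 'http: //proxy:8080' (stray space) or
# 'http://http://proxy:8080' would yield "http" here and reproduce the error.
HTTPS_PROXY='http://proxy.example.com:8080'
host="${HTTPS_PROXY#*://}"; host="${host%%[:/]*}"
echo "$host"   # → proxy.example.com
```

If the extracted host is `http`, fix the `Environment="HTTPS_PROXY=…"` line in the drop-in and reload systemd again.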


@SleeplessBegonia commented on GitHub (May 27, 2024):

> I just installed ollama on an Azure VM. Running `ollama run llama2` results in
>
> `pulling manifest ⠴` for a couple of minutes and eventually:
>
> Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2/manifests/latest": dial tcp 34.120.132.20:443: connect: connection timed out
>
> Visiting the link also results in this response: `{ "errors": [ { "code": "MANIFEST_INVALID", "message": "manifest invalid", "detail": {} } ] }`
>
> I've tried a lot of things seen in other issues, as I'm operating behind a proxy. But nothing seems to work, even though my proxy works for everything else. And the invalid JSON response above leads me to believe that the problem might not be on my end. Please help.

How did you solve that?


@SleeplessBegonia commented on GitHub (May 27, 2024):

> > The FAQ might be unclear, but the `Environment` line is preceded by creating the directory and adding a section header. The complete steps should be:
> >
> > $ mkdir -p /etc/systemd/system/ollama.service.d
> > $ echo '[Service]' >/etc/systemd/system/ollama.service.d/environment.conf
> > $ echo 'Environment="HTTPS_PROXY=https://proxy.example.com"' >>/etc/systemd/system/ollama.service.d/environment.conf
> >
> > The `[Service]` header is missing, so systemd is actually ignoring the environment configuration.
>
> Based on the above solution, I encountered the following issue: `Error: pull model manifest: Get https://registry.ollama.ai/v2/library/qwen/manifests/110b-chat-v1.5- q4_0 proxyconnect tcp: dial tcp: lookup http: no such host`

Hey, did you solve that?


@rumitsa commented on GitHub (Dec 7, 2024):

In my case the issue was related to a bad internet connection. Once the connection was stable, pulling the manifest started working.


Reference: github-starred/ollama#26517