[GH-ISSUE #769] Provide script to pull model manifest and files with curl #46878

Closed
opened 2026-04-28 01:25:16 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @ctsrc on GitHub (Oct 12, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/769

Hi, because my computer is behind an HTTP proxy and I can't get ollama to pull through the proxy, I would like to manually pull the files I need using curl.

First, I try with ollama itself to pull, for example, codellama:34b-code from https://ollama.ai/library/codellama/tags

```zsh
ollama pull codellama:34b-code
```

which doesn't work for me because of the HTTP proxy, but it does say where to get the manifest:

```text
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/codellama/manifests/34b-code": dial tcp: lookup registry.ollama.ai: no such host
```

But if I then try to retrieve that URL with curl (which I've configured to be aware of the HTTP proxy):

```zsh
curl https://registry.ollama.ai/v2/library/codellama/manifests/34b-code
```

I get this error:

```text
{"errors":[{"code":"MANIFEST_INVALID","message":"manifest invalid","detail":{}}]}
```

I would like a small shell script to be included with ollama that takes the name of a model to pull and then uses `curl` to fetch the manifest and the model files, so that it is possible to pull via an HTTP proxy. The script only needs to use curl and does not need to handle the HTTP proxy itself; the local curl configuration will apply. So in theory it should be pretty straightforward to write such a script for anyone who knows the correct URL to pull the manifest from, etc. (And for extracting data from the JSON responses, `jq` can be used in the script.)


@mxyng commented on GitHub (Oct 12, 2023):

Pulling behind a proxy should be fixed in the next release.


@beettlle commented on GitHub (Oct 19, 2023):

Just curious @mxyng, by "next release" do you mean [v0.1.4](https://github.com/jmorganca/ollama/releases/tag/v0.1.4)?


@mxyng commented on GitHub (Oct 19, 2023):

No, [v0.1.2](https://github.com/jmorganca/ollama/releases/tag/v0.1.2) should contain this change.


@beettlle commented on GitHub (Oct 19, 2023):

Thank you. Sadly, this fix doesn't seem to cover all corner cases:

```
:~/github$ ollama -v
ollama version 0.1.3
:~/github$ ollama pull llama2
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2/manifests/latest": dial tcp: lookup registry.ollama.ai on xxx.xxx.xxx.xxx:53: read udp xxx.xxx.xxx.xxx:42810->xxx.xxx.xxx.xxx:53: i/o timeout
:~/github$ curl https://registry.ollama.ai/v2/library/llama2/manifests/latest
{"errors":[{"code":"MANIFEST_INVALID","message":"manifest invalid","detail":{}}]}
```

Should I open up a new issue?


@reactivetype commented on GitHub (Oct 25, 2023):

I have installed `0.1.5` and still get this issue. Please reopen the issue, or should I create a new one?

Reference: github-starred/ollama#46878