[GH-ISSUE #857] Manually download and upload models #62450

Closed
opened 2026-05-03 09:00:54 -05:00 by GiteaMirror · 12 comments
Owner

Originally created by @dawnpatrol04 on GitHub (Oct 20, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/857

hey guys. Having issues getting this part to work with a corporate proxy: `docker exec -it ollama ollama run llama2`.

2 issues.

  1. When I set a proxy, something breaks.

  2. The model URL / cert is not allowed / blocked. To work around this I will need to manually download the model files and upload them to the container.

Can we manually download and upload model files? Where do I put the model files after I have downloaded them?

GiteaMirror added the bug label 2026-05-03 09:00:54 -05:00

@mxyng commented on GitHub (Oct 25, 2023):

You can set `HTTP_PROXY` or `HTTPS_PROXY` when starting the ollama docker container.

```
docker run -d -e HTTP_PROXY -e HTTPS_PROXY ollama/ollama
```

You can also set it globally for all docker containers. See https://docs.docker.com/network/proxy/
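The global option mentioned above uses Docker's client configuration file. A minimal sketch of `~/.docker/config.json`, assuming a hypothetical proxy at `proxy.example.com:3128` (substitute your corporate proxy host and port):

```json
{
  "proxies": {
    "default": {
      "httpProxy": "http://proxy.example.com:3128",
      "httpsProxy": "http://proxy.example.com:3128",
      "noProxy": "localhost,127.0.0.1"
    }
  }
}
```

With this in place, Docker injects the corresponding environment variables into every container it starts, so the `-e` flags become unnecessary.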


@jmorganca commented on GitHub (Oct 30, 2023):

Hi @dawnpatrol04, as @mxyng mentioned this may be from not configuring Ollama to use your http or https proxy. If this doesn't fix the issue please feel free to re-open!


@reactivetype commented on GitHub (Nov 3, 2023):

@dawnpatrol04 @jmorganca this issue is still not solved. There are many open issues about proxies, and it'd be great if you provided a way, and documentation, to download the models manually.


@nmstoker commented on GitHub (Feb 18, 2024):

I second reactivetype's comment. Can this be reopened?

It would be wise to park the proxy point completely and just **confirm if there's a way to download the models manually** as this is often necessary in corporate networks, some of which will have areas where there is literally no internet access allowed.


@nmstoker commented on GitHub (Feb 18, 2024):

I'm guessing the answer may involve putting files in the location here:

https://github.com/ollama/ollama/blob/main/docs/faq.md#where-are-models-stored

I'll see if I can figure it out from common sense and a bit of tinkering.
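Building on the FAQ link above: one workaround that follows from where models are stored is to pull the model on a machine that does have internet access and copy the whole store across. A sketch, assuming the documented default locations (`~/.ollama/models` on Linux, `/root/.ollama` inside the official Docker image); this is a community workaround, not an officially supported flow:

```shell
# On an internet-connected machine: pull the model, then pack the store
# (manifests/ and blobs/ must travel together).
ollama pull llama2
tar -C ~/.ollama -czf ollama-models.tar.gz models

# Transfer ollama-models.tar.gz to the offline machine, then unpack:
tar -C ~/.ollama -xzf ollama-models.tar.gz

# If Ollama runs in Docker, unpack into the container's store instead:
docker cp ollama-models.tar.gz ollama:/root/
docker exec ollama tar -C /root/.ollama -xzf /root/ollama-models.tar.gz
```

After unpacking, `ollama list` on the offline machine should show the copied model, since the server reads whatever manifests and blobs are present in the store.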


@dhirajsuvarna commented on GitHub (Mar 4, 2024):

did someone figure this out?


@sven700c commented on GitHub (Apr 5, 2024):

> I second reactivetype's comment. Can this be reopened?
>
> It would be wise to park the proxy point completely and just **confirm if there's a way to download the models manually** as this is often necessary in corporate networks, some of which will have areas where there is literally no internet access allowed.

Exactly this! There is no Internet access allowed at all on the machine I'm trying to run this on, and other machines only get https access through a web browser, so setting the `https_proxy` env var for the command line won't help.


@nmstoker commented on GitHub (Apr 5, 2024):

@jmorganca any chance this could be re-opened & looked at again purely in regard to manual downloading of models for the reasons given above?


@Seedmanc commented on GitHub (Apr 12, 2024):

I'm not even on a corporate network and pulling models doesn't work at all. I wish someone made a torrent of a working Ollama installation so I would at least have something to start with, driven by example. Starting from scratch is too hard; not a single model comes with the installer itself.


@MattFriedman commented on GitHub (May 5, 2024):

I too am behind a corporate proxy. I've set the env vars mentioned above and have had no luck getting ollama to use my proxy. On the other hand, I can use curl with my proxy to reach the external endpoints, so the proxy itself certainly works. I've tried running in Docker, running from the CLI, and running from the source code, and I constantly get a bad gateway error.

I would like to download the model manually and put it in the right location, if someone knows how to do that.


@usathyan commented on GitHub (Jun 11, 2024):

it downloads the files as manifests and blobs, which is not easy to replicate manually. Unfortunately that makes this product useless for corporate use.
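For reference, the store layout the commenter is describing looks roughly like this on a default Linux install (the `sha256-…` names are placeholders; actual digests differ per model):

```
~/.ollama/models/
├── manifests/
│   └── registry.ollama.ai/
│       └── library/
│           └── llama2/
│               └── latest          # JSON manifest listing the blob layers
└── blobs/
    ├── sha256-…                    # model weights (GGUF)
    └── sha256-…                    # template / params / license layers
```

The manifest references each blob by digest, so copying a model manually means transferring the manifest file plus every blob it points to.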


@amirrezaDev1378 commented on GitHub (Aug 28, 2024):

If anyone still hasn't found a solution to their problem: I've created this simple app that will give you links to download any model, in any size, from the Ollama registry:

https://github.com/amirrezaDev1378/ollama-model-direct-download


Reference: github-starred/ollama#62450