[GH-ISSUE #729] Unable to pull models behind the proxy #46848

Closed
opened 2026-04-28 00:52:12 -05:00 by GiteaMirror · 20 comments
Owner

Originally created by @ilyanoskov on GitHub (Oct 7, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/729

Dear Maintainers,

Thank you very much for creating this project!

I need to set up ollama on Linux behind a proxy, and when pulling I get an error:

```
download.go:166: couldn't download blob: Get "https:///...../ollama/docker/registry/v2/blobs/...": tls: first record does not look like a TLS handshake
```

I have tried these methods and they also did not work for me:

- https://github.com/jmorganca/ollama/issues/703#issuecomment-1747857562
- https://github.com/jmorganca/ollama/issues/676#issuecomment-1744722380
- https://github.com/jmorganca/ollama/issues/697

Could you please add a way to configure a proxy for Ollama? That would help the many users who must work behind one.

Thank you very much in advance.

GiteaMirror added the bug label 2026-04-28 00:52:13 -05:00

@jmorganca commented on GitHub (Oct 7, 2023):

Hi @ilyanoskov, thanks for creating an issue! Will look into this – I know sometimes HTTP proxies can cause issues.


@reactivetype commented on GitHub (Oct 25, 2023):

@jmorganca I have installed the latest version `0.1.5` and still cannot pull any model.

```
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/mistral/manifests/latest": dial tcp 34.120.132.20:443: connect: connection timed out
```

Please advise if it's possible to download the models manually and where we should place them. Thanks!


@bw-Deejee commented on GitHub (Dec 7, 2023):

I have the same problem. Please help


@mcgorias commented on GitHub (Jan 23, 2024):

Hi, I didn't try it, but maybe you can resolve the proxy declaration issue the same way as the Docker daemon:

```bash
sudo nano /etc/systemd/system/ollama.service
```

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"

Environment="https_proxy=http://mycorporateproxy.local:8080" # <----------------

[Install]
WantedBy=default.target
```

```bash
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
```
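After the restart, one way to confirm the unit actually picked up the variable is to print the merged environment with `systemctl show` (a sketch, assuming the service is named `ollama` as above):

```shell
# Print the environment the ollama unit runs with; the
# https_proxy entry should appear after daemon-reload/restart.
sudo systemctl show ollama --property=Environment
```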

@rudoletz commented on GitHub (Jan 26, 2024):

I tested adding the Environment="https_proxy..." line and now I can pull models. Thanks!


@armagg commented on GitHub (May 18, 2024):

How do I add a proxy on a Mac?


@Telly86 commented on GitHub (May 22, 2024):

Mac: Stop the app. Then open a terminal, set your HTTP_PROXY variable, and start the service with `ollama serve`. Open another terminal and pull the model.
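A sketch of the full sequence on macOS; the proxy address is a placeholder and `mistral` is just an example model:

```shell
# Terminal 1: quit the menu-bar app first, then serve with the
# proxy set (replace the placeholder with your real proxy).
export HTTPS_PROXY=http://mycorporateproxy.local:8080
ollama serve

# Terminal 2: pull through the proxied server.
ollama pull mistral
```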


@ilyanoskov commented on GitHub (May 23, 2024):

Hi @jmorganca, this issue persists. I have even tried creating a custom Docker image like described here:
https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-use-ollama-behind-a-proxy-in-docker

Then I run `ollama run codellama:7b-instruct` and I get the same error as in the original message.
And before, I was getting this error: https://github.com/ollama/ollama/issues/729#issuecomment-1779823669


@TheTomer commented on GitHub (May 29, 2024):

> Hi, I didn't try it, but maybe you can resolve the proxy declaration issue the same way as the Docker daemon:
>
> ```bash
> sudo nano /etc/systemd/system/ollama.service
> ```
>
> ```ini
> [Unit]
> Description=Ollama Service
> After=network-online.target
>
> [Service]
> ExecStart=/usr/local/bin/ollama serve
> User=ollama
> Group=ollama
> Restart=always
> RestartSec=3
> Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
>
> Environment="https_proxy=http://mycorporateproxy.local:8080" # <----------------
>
> [Install]
> WantedBy=default.target
> ```
>
> ```bash
> sudo systemctl daemon-reload
> sudo systemctl restart ollama.service
> ```

This solved it for me, thanks!


@AmarkanthJinna commented on GitHub (May 29, 2024):

```
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama3/manifests/latest": dial tcp 104.21.75.227:443: i/o timed out
```

Running behind a corporate proxy on Linux.


@janukarhisa commented on GitHub (Jun 5, 2024):

> > Hi, I didn't try it, but maybe you can resolve the proxy declaration issue the same way as the Docker daemon:
> >
> > ```bash
> > sudo nano /etc/systemd/system/ollama.service
> > ```
> >
> > ```ini
> > [Unit]
> > Description=Ollama Service
> > After=network-online.target
> >
> > [Service]
> > ExecStart=/usr/local/bin/ollama serve
> > User=ollama
> > Group=ollama
> > Restart=always
> > RestartSec=3
> > Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
> >
> > Environment="https_proxy=http://mycorporateproxy.local:8080" # <----------------
> >
> > [Install]
> > WantedBy=default.target
> > ```
> >
> > ```bash
> > sudo systemctl daemon-reload
> > sudo systemctl restart ollama.service
> > ```
>
> This solved it for me, thanks!

I applied this, and now the error has changed from "Timed out" to "403". What am I missing?


@mcgorias commented on GitHub (Jun 7, 2024):

> > > Hi, I didn't try it, but maybe you can resolve the proxy declaration issue the same way as the Docker daemon:
> > >
> > > ```bash
> > > sudo nano /etc/systemd/system/ollama.service
> > > ```
> > >
> > > ```ini
> > > [Unit]
> > > Description=Ollama Service
> > > After=network-online.target
> > >
> > > [Service]
> > > ExecStart=/usr/local/bin/ollama serve
> > > User=ollama
> > > Group=ollama
> > > Restart=always
> > > RestartSec=3
> > > Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
> > >
> > > Environment="https_proxy=http://mycorporateproxy.local:8080" # <----------------
> > >
> > > [Install]
> > > WantedBy=default.target
> > > ```
> > >
> > > ```bash
> > > sudo systemctl daemon-reload
> > > sudo systemctl restart ollama.service
> > > ```
> >
> > This solved it for me, thanks!
>
> I applied this, and now the error has changed from "Timed out" to "403". What am I missing?

Hi,

403 usually means "Forbidden". Does your corporate proxy require authentication?

If your corporate proxy authenticates via NTLM, you might need to use "CNTLM" to create an intermediate unauthenticated proxy on your machine that links to your company's proxy and injects your credentials (https://ruslanmv.com/blog/How-to-install-local-proxy-with-cntlm).

Then provide `http://localhost:3128` as the proxy in `/etc/systemd/system/ollama.service`.

Steven
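For example, with CNTLM listening on its default port 3128 on the same machine, the service entry would look like this (a sketch; the port assumes CNTLM's default configuration):

```ini
[Service]
Environment="https_proxy=http://localhost:3128"
```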


@huornlmj commented on GitHub (Oct 1, 2024):

`Environment="https_proxy=http://mycorporateproxy.local:8080"` — this worked for me. Why isn't this in the documentation?


@Rishi2674 commented on GitHub (Feb 6, 2025):

The proxy is set in my environment, but it still isn't working, and I am on a Windows system. Any solutions?
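On Windows, the Ollama app should inherit user-level environment variables, so one hedged sketch (Command Prompt; the proxy address is a placeholder) is to persist the variable with `setx` and then fully quit and restart Ollama:

```shell
:: Persist HTTPS_PROXY for the current user. setx only affects
:: newly started processes, so quit and restart the Ollama app.
setx HTTPS_PROXY "http://mycorporateproxy.local:8080"
```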


@Seeumt commented on GitHub (Feb 10, 2025):

> Mac: Stop the app. Then open a terminal, set your HTTP_PROXY variable, and start the service with `ollama serve`. Open another terminal and pull the model.

Hi, could you show the complete command for `ollama serve ...`?
Thanks so much!


@winnstorm commented on GitHub (Feb 13, 2025):

> Hi, I didn't try it, but maybe you can resolve the proxy declaration issue the same way as the Docker daemon:
>
> ```bash
> sudo nano /etc/systemd/system/ollama.service
> ```
>
> ```ini
> [Unit]
> Description=Ollama Service
> After=network-online.target
>
> [Service]
> ExecStart=/usr/local/bin/ollama serve
> User=ollama
> Group=ollama
> Restart=always
> RestartSec=3
> Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
>
> Environment="https_proxy=http://mycorporateproxy.local:8080" # <----------------
>
> [Install]
> WantedBy=default.target
> ```
>
> ```bash
> sudo systemctl daemon-reload
> sudo systemctl restart ollama.service
> ```

Hero tip, thanks!


@trouvaillle commented on GitHub (Mar 21, 2025):

It may help: when using a self-signed HTTPS proxy, don't use `https://mycorporateproxy.local/`; use `http://mycorporateproxy.local:443/`.
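One way to check which proxy URL scheme your proxy accepts, independent of Ollama, is curl's `-x` flag against the registry (a sketch; the proxy address is a placeholder):

```shell
# If this succeeds with http:// in -x but fails with https://,
# use the http:// form in your HTTPS_PROXY setting as well.
curl -x http://mycorporateproxy.local:443 -sSf \
  "https://registry.ollama.ai/v2/library/mistral/manifests/latest" \
  -o /dev/null && echo "proxy OK"
```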


@WinkelB commented on GitHub (Apr 4, 2025):

> It may help: when using a self-signed HTTPS proxy, don't use `https://mycorporateproxy.local/`; use `http://mycorporateproxy.local:443/`.

This is exactly what works for me: I only use the HTTPS_PROXY variable and set it to `http://proxyname:port`.


@huornlmj commented on GitHub (Jul 26, 2025):

Take note that if you store your proxy settings in the systemd service file, a later upgrade of Ollama via the standard curl-based install script will clobber that file and you will lose your proxy settings. You will need to take a backup of the file before you upgrade.
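One way to sidestep this, assuming the install script rewrites only the main unit file and leaves drop-in directories alone, is to keep the proxy setting in a systemd override instead:

```shell
# Creates /etc/systemd/system/ollama.service.d/override.conf,
# which systemd merges with the main unit file.
sudo systemctl edit ollama.service
# In the editor, add:
#   [Service]
#   Environment="https_proxy=http://mycorporateproxy.local:8080"
sudo systemctl daemon-reload
sudo systemctl restart ollama
```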


@sascha-kirch commented on GitHub (Aug 16, 2025):

> Take note that if you store your proxy settings in the systemd service file, a later upgrade of Ollama via the standard curl-based install script will clobber that file and you will lose your proxy settings. You will need to take a backup of the file before you upgrade.

Here is a small script that installs/updates Ollama and updates the service with the current proxy setting:
https://github.com/sascha-kirch/linux-forgeup/blob/69ea239b40e3a0d0ab6bf978fad49f2ec7ccb177/install_ollama.sh

```bash
#!/bin/bash

echo "Installing/Updating ollama..."
sh -c "$(curl -fsSL https://ollama.com/install.sh)"


# Update configs of ollama service
# Note that the override.conf is merged with the default service file
OLLAMA_SERVICE_DIR="/etc/systemd/system/ollama.service.d"

if [ ! -d "$OLLAMA_SERVICE_DIR" ]; then
    echo "Creating directory for ollama service overrides..."
    sudo mkdir -p "$OLLAMA_SERVICE_DIR"
else
    echo "Directory for ollama service overrides already exists."
fi


# using tee to be able to combine it with sudo
echo "[Service]" | sudo tee $OLLAMA_SERVICE_DIR/override.conf # this will overwrite an existing file
echo "Environment=\"https_proxy=$HTTP_PROXY\"" | sudo tee -a $OLLAMA_SERVICE_DIR/override.conf # -a for appending

# restart service
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
```
Reference: github-starred/ollama#46848