[GH-ISSUE #4834] Cannot pull models when http_proxy/HTTP_PROXY are set. #28816

Closed
opened 2026-04-22 07:22:37 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @janukarhisa on GitHub (Jun 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4834

What is the issue?

Our server is located behind a proxy. Both the host and the Docker daemon have `http_proxy`, `https_proxy`, `HTTP_PROXY`, and `HTTPS_PROXY` set so that the proxy settings apply to all containers.

For testing purposes, I created a container with the following command:

```bash
docker run -d -v {host_path}:/root/.ollama ollama/ollama:latest
```

When I exec into the container and try to pull the Mistral model, I get the following error:

```
Error: something went wrong, please see the ollama server logs for details
```

However, if I unset the `http_proxy` and `HTTP_PROXY` environment variables, pulling works without any problem. What am I missing here?
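Since the CLI error defers to the server logs, a minimal way to see the real failure is to read those logs and check which proxy variables the server process inherited (a sketch assuming the default container setup above; `<container_id>` is a placeholder):

```bash
# Find the running ollama container (the ID below is a placeholder)
docker ps

# The actual pull error is written to the server log, not the CLI output
docker logs <container_id>

# Check which proxy variables the server process actually inherited
docker exec <container_id> env | grep -i proxy
```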

OS

Linux

GPU

No response

CPU

Intel

Ollama version

0.1.41

GiteaMirror added the bug label 2026-04-22 07:22:37 -05:00
Author
Owner

@mxyng commented on GitHub (Aug 23, 2024):

The Ollama client uses HTTP to communicate with the server. If you've set HTTP_PROXY in the container and it is inherited by the Ollama CLI, the client's requests to the server will be forwarded through your proxy.
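One workaround consistent with this explanation is to proxy only the outbound registry traffic, which happens over HTTPS, and exempt loopback traffic so the CLI's requests to the local server are not sent through the proxy. A sketch, assuming a standard forward proxy (the proxy URL, `{host_path}`, and container name below are placeholders):

```bash
# Sketch only: proxy URL and {host_path} are placeholders for your values.
docker run -d \
  -v {host_path}:/root/.ollama \
  -e HTTPS_PROXY=http://proxy.example.com:3128 \
  -e NO_PROXY=localhost,127.0.0.1 \
  --name ollama \
  ollama/ollama:latest

# Model pulls go out through HTTPS_PROXY; the CLI's requests to the
# local server at 127.0.0.1:11434 bypass the proxy via NO_PROXY.
docker exec -it ollama ollama pull mistral
```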

