[GH-ISSUE #2622] How to set a crt file or disable the SSL verify in Windows #48060

Open
opened 2026-04-28 06:35:20 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @neuwcodebox on GitHub (Feb 21, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2622

Originally assigned to: @dhiltgen on GitHub.

Hello.
I am having a problem with a 403 response from the run command while trying to use Ollama (Windows Preview) behind a company proxy server.
There is nothing unusual in the log, but it is clearly a proxy problem.
The http(s)_proxy environment variables are set and the company's crt certificate is installed.
I remember being able to turn off SSL verification or specify a crt file when using other programs such as pip.
Does Ollama support the same option? My company does unusual things to monitor HTTPS connections, so there are many problems like this :/
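For context, what the request amounts to is an HTTP client that trusts an extra CA bundle on top of the system roots. A minimal Go sketch of how such an option could work is below; `OLLAMA_CA_CERT` is a hypothetical environment variable used for illustration, not something Ollama currently supports:

```go
package main

import (
	"crypto/tls"
	"crypto/x509"
	"fmt"
	"net/http"
	"os"
)

// transportWithCustomCA builds an HTTP transport that trusts an extra PEM
// certificate file on top of the system roots. caPath may be empty, in
// which case only the system store is used. Note that on Windows,
// x509.SystemCertPool reads the system certificate store (Go 1.18+).
func transportWithCustomCA(caPath string) (*http.Transport, error) {
	pool, err := x509.SystemCertPool()
	if err != nil {
		pool = x509.NewCertPool()
	}
	if caPath != "" {
		pem, err := os.ReadFile(caPath)
		if err != nil {
			return nil, fmt.Errorf("read CA file: %w", err)
		}
		if !pool.AppendCertsFromPEM(pem) {
			return nil, fmt.Errorf("no certificates found in %s", caPath)
		}
	}
	return &http.Transport{
		Proxy:           http.ProxyFromEnvironment, // honor HTTP(S)_PROXY
		TLSClientConfig: &tls.Config{RootCAs: pool},
	}, nil
}

func main() {
	// Hypothetical: take the corporate CA path from an environment variable.
	tr, err := transportWithCustomCA(os.Getenv("OLLAMA_CA_CERT"))
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	client := &http.Client{Transport: tr}
	_ = client
	fmt.Println("transport configured")
}
```

This keeps certificate verification on; the proxy's MITM certificate is simply added to the trusted set, which is safer than disabling verification outright.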

GiteaMirror added the networking, windows labels 2026-04-28 06:35:21 -05:00

@neuwcodebox commented on GitHub (Feb 22, 2024):

...or can I manually download the checkpoint file and set it in ollama?


@dhirajsuvarna commented on GitHub (Mar 4, 2024):

+1, facing the same problem:

```
pulling manifest
Error: 403:
```

@dhiltgen commented on GitHub (May 2, 2024):

Are you still experiencing problems behind a proxy? Can you try the latest release, 0.1.33, and if it still doesn't work properly with the [proxy settings](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-use-ollama-behind-a-proxy), can you share your server log? Setting OLLAMA_DEBUG=1 may be helpful.


@neuwcodebox commented on GitHub (May 3, 2024):

@dhiltgen
Hello, I'm still having trouble behind the proxy.

CLI:

```
>ollama run mistral
pulling manifest
Error: 403:
```

server.log:

```
time=2024-05-03T16:59:44.573+09:00 level=INFO source=images.go:828 msg="total blobs: 0"
time=2024-05-03T16:59:44.694+09:00 level=INFO source=images.go:835 msg="total unused blobs removed: 0"
time=2024-05-03T16:59:44.696+09:00 level=INFO source=routes.go:1071 msg="Listening on 127.0.0.1:11434 (version 0.1.33)"
time=2024-05-03T16:59:44.698+09:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 cuda_v11.3 rocm_v5.7]"
time=2024-05-03T16:59:44.698+09:00 level=INFO source=gpu.go:96 msg="Detecting GPUs"
time=2024-05-03T16:59:44.917+09:00 level=INFO source=gpu.go:101 msg="detected GPUs" library=C:\Users\user\AppData\Local\Programs\Ollama\cudart64_110.dll count=1
time=2024-05-03T16:59:44.918+09:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
[GIN] 2024/05/03 - 16:59:48 | 200 |            0s |       127.0.0.1 | HEAD     "/"
[GIN] 2024/05/03 - 16:59:48 | 404 |       584.9µs |       127.0.0.1 | POST     "/api/show"
[GIN] 2024/05/03 - 16:59:51 | 200 |    2.1054245s |       127.0.0.1 | POST     "/api/pull"
```

I think we need an option to disable SSL verification or to set a crt file path.


@dhiltgen commented on GitHub (May 21, 2024):

Have you tried `ollama pull mistral --insecure`?


@neuwcodebox commented on GitHub (May 22, 2024):

@dhiltgen I tried, and the result was the same.


@ibaxo commented on GitHub (May 24, 2024):

@dhiltgen I have also tried the proposed `--insecure` option.
However, this option seems to force Ollama to use HTTP instead of HTTPS. That can be a problem, since plain HTTP traffic may not be allowed through the proxy at all.
It would be really nice if Ollama could take certificates from the Windows store, or if there were an option to use custom certificates, or a way to disable SSL validation entirely.
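For reference, skipping certificate verification does not require falling back to plain HTTP: a TLS client can keep the https scheme and only relax identity checks. A minimal Go sketch of that behavior (an illustration of the distinction, not Ollama's actual code):

```go
package main

import (
	"crypto/tls"
	"net/http"
)

// insecureTransport keeps HTTPS but skips certificate chain and hostname
// verification -- the TLS analogue of curl's --insecure. Traffic is still
// encrypted, but the server's identity is not checked, so use this only
// on trusted networks.
func insecureTransport() *http.Transport {
	return &http.Transport{
		Proxy: http.ProxyFromEnvironment, // still honor HTTP(S)_PROXY
		TLSClientConfig: &tls.Config{
			InsecureSkipVerify: true,
		},
	}
}

func main() {
	client := &http.Client{Transport: insecureTransport()}
	_ = client
}
```

This is what one would expect a working `--insecure` flag to do: the connection stays on HTTPS (so proxies that block plain HTTP are unaffected), and only the trust check is relaxed.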


@dhiltgen commented on GitHub (May 24, 2024):

@ibaxo that's definitely a bug. The `--insecure` flag should not force HTTP. Let me get a PR up to fix that...

Reference: github-starred/ollama#48060