[GH-ISSUE #4901] Error: pull model manifest: ssh: no key found #65132

Closed
opened 2026-05-03 19:49:25 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @674316 on GitHub (Jun 7, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4901

What is the issue?

ollama pull vicuna

pulling manifest
Error: pull model manifest: ssh: no key found

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

ollama version is 0.1.41

GiteaMirror added the networking, bug labels 2026-05-03 19:49:25 -05:00
Author
Owner

@674316 commented on GitHub (Jun 7, 2024):

2024/06/07 17:32:33 routes.go:1007: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST: OLLAMA_KEEP_ALIVE: OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_MODELS: OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR:C:\Users\admin\AppData\Local\Programs\Ollama\ollama_runners OLLAMA_TMPDIR:]"
time=2024-06-07T17:32:34.008+08:00 level=INFO source=images.go:729 msg="total blobs: 0"
time=2024-06-07T17:32:34.008+08:00 level=INFO source=images.go:736 msg="total unused blobs removed: 0"
time=2024-06-07T17:32:34.009+08:00 level=INFO source=routes.go:1053 msg="Listening on 127.0.0.1:11434 (version 0.1.41)"
time=2024-06-07T17:32:34.009+08:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu_avx cpu_avx2 cuda_v11.3 rocm_v5.7 cpu]"
time=2024-06-07T17:32:34.223+08:00 level=INFO source=types.go:71 msg="inference compute" id=GPU-b7b9745c-884e-03a1-4b88-b882bb50d3bd library=cuda compute=8.6 driver=12.2 name="NVIDIA GeForce RTX 3060 Ti" total="8.0 GiB" available="7.0 GiB"
[GIN] 2024/06/07 - 17:32:34 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/06/07 - 17:32:34 | 404 | 512.4µs | 127.0.0.1 | POST "/api/show"
[GIN] 2024/06/07 - 17:32:35 | 200 | 827.518ms | 127.0.0.1 | POST "/api/pull"

Author
Owner

@malteneuss commented on GitHub (Jun 23, 2024):

Any idea what could be the issue? I'm running Ollama on NixOS after a fresh reinstall and it doesn't work anymore. Tried versions 1.38 and 1.45.

edit:
I don't know what went wrong, but I was able to get it to work again.
Apparently, Ollama generates a public/private key pair to download models from the Ollama registry. By deleting the existing keys (/var/lib/ollama/.ollama/id_ed25519 and /var/lib/ollama/.ollama/id_ed25519.pub), I was able to let Ollama generate a new pair:

Couldn't find '/var/lib/ollama/.ollama/id_ed25519'. Generating new private key.
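A minimal sketch of that recovery for a systemd-managed install whose service keys live under /var/lib/ollama/.ollama, as in the message above; the service name and use of sudo are assumptions about that setup:

```
# Stop the service, remove the possibly corrupted key pair used by the
# ollama service user, then start it again so it regenerates a fresh
# ed25519 pair on startup.
sudo systemctl stop ollama
sudo rm /var/lib/ollama/.ollama/id_ed25519 /var/lib/ollama/.ollama/id_ed25519.pub
sudo systemctl start ollama

# The server log should then show a line like:
#   Couldn't find '/var/lib/ollama/.ollama/id_ed25519'. Generating new private key.
ollama pull vicuna
```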
Author
Owner

@AnimationFlow commented on GitHub (Jul 10, 2024):

I also couldn't pull Ollama models because of the SSH error, but managed to fix it.

I opened id_ed25519 in VS Code and saw a lot of symbols like a ? in a rhombus (the Unicode replacement character).
So I figured it was corrupted and created a new key using:

ssh-keygen -t rsa -b 4096 -C "my.email@gmail.com"

The new key files were in my_user_folder/.ssh.

Then I replaced the old files:

  • id_ed25519
  • id_ed25519.pub

with the newly generated ones (renamed to match the old names), and it worked like a charm.

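A hypothetical sketch of that workaround. Per the maintainer's comment below, RSA keys aren't supported for pushing to ollama.com, so this variant generates an ed25519 key instead; the per-user ~/.ollama location and the temporary file name are assumptions (on Windows the keys live under the user profile's .ollama folder):

```
# Generate a fresh ed25519 key pair with ssh-keygen and drop it in place of
# the corrupted Ollama key files (per-user install assumed).
ssh-keygen -t ed25519 -N "" -f ~/.ssh/ollama_id_ed25519
cp ~/.ssh/ollama_id_ed25519 ~/.ollama/id_ed25519
cp ~/.ssh/ollama_id_ed25519.pub ~/.ollama/id_ed25519.pub
```

Simply deleting the old files and letting the server regenerate them (as in the previous comment) achieves the same result with fewer steps.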
Author
Owner

@pdevine commented on GitHub (Jul 11, 2024):

@silviodonlic RSA keys aren't supported if you want to be able to push a model to ollama.com. The solution here is just to remove the two corrupted files and restart the ollama server, which will automatically create new ed25519 keys (what @malteneuss did).

I would like to know how the keys got corrupted, though. The keys are supposed to be in PEM format, which should be plain text. They're also only created in one place, so I'm not quite sure how there could be a race condition there. I'm going to go ahead and close the issue since there's a simple workaround, but if anyone can reproduce this I'm happy to reopen and try to troubleshoot.

Author
Owner

@user2745 commented on GitHub (Oct 20, 2024):

I also got this same issue, running on my OrangePi desktop PC.

Author
Owner

@tariqhawis commented on GitHub (Jul 3, 2025):

I fixed this error as follows (a sketch of these steps follows the list):

  1. Copy the ~/.ollama directory to /usr/share/ollama/
  2. Restart the service: systemctl restart ollama
  3. Run the pull again: ollama pull ...
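A sketch of those steps, assuming a Linux install where the systemd service runs as the ollama user with /usr/share/ollama as its home directory; the chown step is an added assumption so the service user can read the copied keys:

```
# Copy the per-user key material into the service user's home so the
# ollama service picks it up, then restart and retry the pull.
sudo cp -r ~/.ollama /usr/share/ollama/
sudo chown -R ollama:ollama /usr/share/ollama/.ollama   # assumed: service runs as user "ollama"
sudo systemctl restart ollama
ollama pull vicuna
```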
Author
Owner

@VladyslavSan commented on GitHub (Jul 16, 2025):

Just run ollama serve; it will fail, but in the meantime it will create new keys and everything will work just fine.

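A sketch of that shortcut, assuming a per-user install whose keys live in ~/.ollama:

```
# Running the server once in the foreground is enough to trigger key
# generation; per the comment above, it may exit with an error if another
# instance already owns port 11434, but the new key pair is still written.
ollama serve
ls ~/.ollama/id_ed25519*   # verify the freshly generated key pair exists
ollama pull vicuna
```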

Reference: github-starred/ollama#65132