[GH-ISSUE #10412] Ollama won't run anything - ssh: no key found #32604

Closed
opened 2026-04-22 14:05:10 -05:00 by GiteaMirror · 4 comments
Originally created by @rockenman1234 on GitHub (Apr 25, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10412

What is the issue?

Hey Y'all!

I downloaded Ollama from the installation script, and seem to be having an issue on Fedora Linux 42. I was having issues with SELinux earlier, but I fixed those with the commands recommended from SELinux Troubleshooter. This fixed my issue for all of 10 minutes until I restarted my PC. Since then, nothing from SELinux has popped up.

Now when I try to run any model, I get an error about my SSH keys not being found. I do have them (both RSA and ed25519) and have even used them with Git, but I couldn't find anything online about how to fix this error.

The expected output was to launch a prompt with my chosen LLM, but now I keep getting the same error: `pulling manifest Error: pull model manifest: ssh: no key found`.

I've tried reinstalling Ollama from scratch and that didn't work; any help is greatly appreciated. Thanks!

Relevant log output

┬─[alex@Aloha:~]─[02:54:31 PM]
╰─>$ systemctl status ollama
● ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; preset: disabled)
    Drop-In: /usr/lib/systemd/system/service.d
             └─10-timeout-abort.conf, 50-keep-warm.conf
     Active: active (running) since Fri 2025-04-25 14:53:03 EDT; 2min 6s ago
 Invocation: 7ed2e374ed66424e82ac3b84ce3338dd
   Main PID: 9355 (ollama)
      Tasks: 17 (limit: 37184)
     Memory: 14.3M (peak: 16M)
        CPU: 43ms
     CGroup: /system.slice/ollama.service
             └─9355 /usr/local/bin/ollama serve

Apr 25 14:53:03 Aloha ollama[9355]: time=2025-04-25T14:53:03.903-04:00 level=WARN source=amd_linux.go:376 msg="amdgpu is not supported (supported type>
Apr 25 14:53:03 Aloha ollama[9355]: time=2025-04-25T14:53:03.903-04:00 level=WARN source=amd_linux.go:383 msg="See https://github.com/ollama/ollama/bl>
Apr 25 14:53:03 Aloha ollama[9355]: time=2025-04-25T14:53:03.904-04:00 level=INFO source=amd_linux.go:296 msg="unsupported Radeon iGPU detected skippi>
Apr 25 14:53:03 Aloha ollama[9355]: time=2025-04-25T14:53:03.904-04:00 level=INFO source=amd_linux.go:402 msg="no compatible amdgpu devices detected"
Apr 25 14:53:03 Aloha ollama[9355]: time=2025-04-25T14:53:03.904-04:00 level=INFO source=gpu.go:377 msg="no compatible GPUs were discovered"
Apr 25 14:53:03 Aloha ollama[9355]: time=2025-04-25T14:53:03.904-04:00 level=INFO source=types.go:130 msg="inference compute" id=0 library=cpu variant>
Apr 25 14:53:18 Aloha ollama[9355]: [GIN] 2025/04/25 - 14:53:18 | 200 |      38.634µs |       127.0.0.1 | HEAD     "/"
Apr 25 14:53:18 Aloha ollama[9355]: [GIN] 2025/04/25 - 14:53:18 | 404 |     173.761µs |       127.0.0.1 | POST     "/api/show"
Apr 25 14:53:18 Aloha ollama[9355]: [GIN] 2025/04/25 - 14:53:18 | 200 |  192.799738ms |       127.0.0.1 | POST     "/api/pull"
Apr 25 14:54:31 Aloha ollama[9355]: [GIN] 2025/04/25 - 14:54:31 | 200 |      32.272µs |       127.0.0.1 | GET      "/api/version"
┬─[alex@Aloha:~]─[02:55:17 PM]
╰─>$ ollama pull granite3-moe:3b
pulling manifest 
Error: pull model manifest: ssh: no key found
┬─[alex@Aloha:~]─[02:55:25 PM]
╰─>$ ollama run granite3-moe:3b
pulling manifest 
Error: pull model manifest: ssh: no key found
┬─[alex@Aloha:~]─[02:58:17 PM]
╰─>$ ls -al /home/alex/.ssh/
total 16
drwx------. 1 alex alex   80 Apr 25 14:45 ./
drwx------. 1 alex alex  510 Apr 25 14:44 ../
-rw-------. 1 alex alex  419 Apr 25 14:50 id_ed25519
-rw-r--r--. 1 alex alex  104 Apr 25 14:50 id_ed25519.pub
-rw-------. 1 alex alex 3389 Apr 25 14:45 id_rsa
-rw-r--r--. 1 alex alex  748 Apr 25 14:45 id_rsa.pub
┬─[alex@Aloha:~]─[03:00:18 PM]
╰─>$ cat /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/lib64/ccache:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/var/lib/snapd/snap/bin"

[Install]
WantedBy=default.target
┬─[alex@Aloha:~]─[03:01:43 PM]
╰─>$ which ollama
/usr/local/bin/ollama

OS

Linux

GPU

AMD

CPU

AMD

Ollama version

0.6.6

GiteaMirror added the bug label 2026-04-22 14:05:11 -05:00

@rick-github commented on GitHub (Apr 25, 2025):

What's the result of

sudo ls -al ~ollama/.ollama
<!-- gh-comment-id:2831314138 -->

@rockenman1234 commented on GitHub (Apr 25, 2025):

What's the result of

sudo ls -al ~ollama/.ollama

Thanks for the reply! Result is below:

┬─[alex@Aloha:~]─[03:53:38 PM]
╰─>$ sudo ls -al ~ollama/.ollama
[sudo] password for alex: 
total 0
drwxr-xr-x. 1 ollama ollama  32 Apr 25 14:26 .
drwx------. 1 ollama ollama 124 Apr 25 14:20 ..
-rw-------. 1 ollama ollama   0 Apr 25 14:26 id_ed25519
drwxr-xr-x. 1 ollama ollama  28 Apr 25 15:19 models
┬─[alex@Aloha:~]─[03:53:51 PM]
╰─>$ 
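
(Editor's note, hedged: the zero-byte `id_ed25519` in the listing above appears to be the root cause here. The server finds a key file that exists but contains no key material, hence `ssh: no key found`. A minimal sketch of that check, simulated with a temp file standing in for `~ollama/.ollama/id_ed25519`:)

```shell
# is_empty_key: succeeds (exit 0) if the given key file exists but is empty,
# which is the failure mode shown in the listing above.
# `-e` tests existence; `-s` is true only for files with size > 0.
is_empty_key() {
  [ -e "$1" ] && [ ! -s "$1" ]
}

# Demo with a temp file (mktemp creates an empty file, like the failed key):
keyfile=$(mktemp)
if is_empty_key "$keyfile"; then
  echo "empty key file - remove it and restart ollama so it can regenerate"
fi
rm -f "$keyfile"
```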
<!-- gh-comment-id:2831316826 -->

@rick-github commented on GitHub (Apr 25, 2025):

sudo rm ~ollama/.ollama/id_ed25519
sudo systemctl stop ollama
sudo systemctl start ollama
sudo ls -al ~ollama/.ollama
<!-- gh-comment-id:2831326732 -->

@rockenman1234 commented on GitHub (Apr 25, 2025):

That worked, thank you! I was able to get ollama working again after deleting that SSH key.
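
(Editor's note, hedged: for anyone verifying the same fix, a healthy regenerated key should be non-empty and private to its owner. A sanity-check sketch, simulated below with a temp file; on a default systemd install the real file would be `~ollama/.ollama/id_ed25519`, owned by the `ollama` user. `stat -c` is GNU coreutils syntax, fine on Fedora:)

```shell
# key_ok: succeeds only if the key file is non-empty and has mode 600
# (owner read/write only), which is what a successful regeneration
# should leave behind.
key_ok() {
  [ -s "$1" ] && [ "$(stat -c '%a' "$1")" = "600" ]
}

# Demo with a stand-in file for the regenerated key:
k=$(mktemp)
printf 'FAKE PRIVATE KEY MATERIAL\n' > "$k"
chmod 600 "$k"
if key_ok "$k"; then
  echo "key looks regenerated"
fi
rm -f "$k"
```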

<!-- gh-comment-id:2831331020 -->

Reference: github-starred/ollama#32604