[GH-ISSUE #15074] Bug: 401 Unauthorized on cloud pull and 'ollama login' command fails (Windows 11) #9670

Open
opened 2026-04-12 22:33:32 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @LucasPavs on GitHub (Mar 26, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15074

What is the issue?

Describe the bug
I am unable to authenticate my Ollama client on Windows 11. When I run ollama login, nothing happens (no browser opens). When I try to pull or run a cloud model (like minimax-m2.7:cloud or even llama3.2:1b), I get an Error: pull model manifest: 401.

To Reproduce

Install Ollama v0.18.2 on Windows 11.

Generate a new SSH key pair manually, then display the public key: cat ~/.ollama/id_ed25519.pub.

Add the public key manually to ollama.com/settings/keys.

Run ollama run llama3.2:1b or ollama run minimax-m2.7:cloud.

Command fails with 401 Unauthorized.

Expected behavior
The client should recognize the public key registered on the website and allow the pull, or ollama login should open the browser to verify the session.

Environment:

OS: Windows 11

Ollama Version: 0.18.2

Network: Brazilian ISP (tested with local server at 127.0.0.1:11434)

Additional Context:

I have already tried resetting the .ollama folder and regenerating keys.

OLLAMA_HOST is set to 127.0.0.1:11434.

The URL ollama.com/authorize (often triggered by login) returns a 404 error on the website.

Pulling from hf.co works fine, but official registry pulls fail with 401.
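Before suspecting the registry, a quick local sanity check can rule out a malformed key file. This is a hypothetical helper script (not part of the ollama CLI), written for a POSIX shell such as Git Bash or WSL; a valid client public key should be a single line starting with ssh-ed25519:

```shell
#!/bin/sh
# Hypothetical sanity check for the Ollama client key pair.
# The directory is a parameter purely for illustration/testing;
# on this machine the real key lives in ~/.ollama.
check_ollama_key() {
  pub="$1/id_ed25519.pub"
  if [ ! -f "$pub" ]; then
    echo "missing"
  elif grep -q '^ssh-ed25519 ' "$pub"; then
    echo "ok"
  else
    echo "malformed"
  fi
}

check_ollama_key "${OLLAMA_DIR:-$HOME/.ollama}"
```

A "malformed" result would point at the key generation step rather than the registry or network.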

Server Debug Logs:
Even after adding the public key to my account on ollama.com, I get a 401 error. The server log shows that the /api/pull request is received, but it fails to authenticate with the remote registry.

My OLLAMA_ID is set to: C:\Users\Lukinhas\.ollama\id_ed25519
My OLLAMA_HOST is: 127.0.0.1:11434

Relevant log output

time=2026-03-25T01:44:57.118-03:00 level=INFO source=routes.go:1740 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:0 OLLAMA_DEBUG:DEBUG OLLAMA_DEBUG_LOG_REQUESTS:false OLLAMA_EDITOR: OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:C:\\Users\\Lukinhas\\.ollama\\models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NO_CLOUD:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false OLLAMA_VULKAN:false ROCR_VISIBLE_DEVICES:]"
time=2026-03-25T01:44:57.118-03:00 level=INFO source=routes.go:1742 msg="Ollama cloud disabled: false"
time=2026-03-25T01:44:57.119-03:00 level=INFO source=images.go:477 msg="total blobs: 0"
time=2026-03-25T01:44:57.119-03:00 level=INFO source=images.go:484 msg="total unused blobs removed: 0"
time=2026-03-25T01:44:57.121-03:00 level=INFO source=routes.go:1798 msg="Listening on 127.0.0.1:11434 (version 0.18.3)"
time=2026-03-25T01:44:57.122-03:00 level=DEBUG source=sched.go:145 msg="starting llm scheduler"
time=2026-03-25T01:44:57.123-03:00 level=INFO source=runner.go:67 msg="discovering available GPUs..."
time=2026-03-25T01:44:57.137-03:00 level=INFO source=server.go:432 msg="starting runner" cmd="C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\ollama.exe runner --ollama-engine --port 54365"
time=2026-03-25T01:44:57.137-03:00 level=DEBUG source=server.go:433 msg=subprocess OLLAMA_DEBUG=1 OLLAMA_HOST=127.0.0.1:11434 OLLAMA_ID=C:\Users\Lukinhas\.ollama\id_ed25519 OLLAMA_NO_CLOUD=false PATH="C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama;C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama\\cuda_v12;C:\\jdk-21.0.10\\bin;C:\\WINDOWS\\system32;C:\\WINDOWS;C:\\WINDOWS\\System32\\Wbem;C:\\WINDOWS\\System32\\WindowsPowerShell\\v1.0\\;C:\\WINDOWS\\System32\\OpenSSH\\;C:\\Program Files\\Git\\cmd;C:\\Exercism;C:\\Program Files\\dotnet\\;C:\\Program Files\\nodejs\\;C:\\apache-maven-3.9.12\\bin;;C:\\Program Files\\Docker\\Docker\\resources\\bin;C:\\jdk-21.0.10\\bin;C:\\Users\\Lukinhas\\AppData\\Local\\Microsoft\\WindowsApps;C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Microsoft VS Code\\bin;C:\\MinGW\\bin;C:\\Exercism;C:\\Users\\Lukinhas\\AppData\\Roaming\\npm;;C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama" OLLAMA_LIBRARY_PATH=C:\Users\Lukinhas\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\Lukinhas\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12
time=2026-03-25T01:44:57.664-03:00 level=DEBUG source=runner.go:437 msg="bootstrap discovery took" duration=535.23ms OLLAMA_LIBRARY_PATH="[C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama\\cuda_v12]" extra_envs=map[]
time=2026-03-25T01:44:57.667-03:00 level=INFO source=server.go:432 msg="starting runner" cmd="C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\ollama.exe runner --ollama-engine --port 54374"
time=2026-03-25T01:44:57.668-03:00 level=DEBUG source=server.go:433 msg=subprocess OLLAMA_DEBUG=1 OLLAMA_HOST=127.0.0.1:11434 OLLAMA_ID=C:\Users\Lukinhas\.ollama\id_ed25519 OLLAMA_NO_CLOUD=false PATH="C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama;C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama\\cuda_v13;C:\\jdk-21.0.10\\bin;C:\\WINDOWS\\system32;C:\\WINDOWS;C:\\WINDOWS\\System32\\Wbem;C:\\WINDOWS\\System32\\WindowsPowerShell\\v1.0\\;C:\\WINDOWS\\System32\\OpenSSH\\;C:\\Program Files\\Git\\cmd;C:\\Exercism;C:\\Program Files\\dotnet\\;C:\\Program Files\\nodejs\\;C:\\apache-maven-3.9.12\\bin;;C:\\Program Files\\Docker\\Docker\\resources\\bin;C:\\jdk-21.0.10\\bin;C:\\Users\\Lukinhas\\AppData\\Local\\Microsoft\\WindowsApps;C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Microsoft VS Code\\bin;C:\\MinGW\\bin;C:\\Exercism;C:\\Users\\Lukinhas\\AppData\\Roaming\\npm;;C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama" OLLAMA_LIBRARY_PATH=C:\Users\Lukinhas\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\Lukinhas\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13
time=2026-03-25T01:44:58.066-03:00 level=DEBUG source=runner.go:437 msg="bootstrap discovery took" duration=401.7361ms OLLAMA_LIBRARY_PATH="[C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama\\cuda_v13]" extra_envs=map[]
time=2026-03-25T01:44:58.069-03:00 level=INFO source=server.go:432 msg="starting runner" cmd="C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\ollama.exe runner --ollama-engine --port 54384"
time=2026-03-25T01:44:58.070-03:00 level=DEBUG source=server.go:433 msg=subprocess OLLAMA_DEBUG=1 OLLAMA_HOST=127.0.0.1:11434 OLLAMA_ID=C:\Users\Lukinhas\.ollama\id_ed25519 OLLAMA_NO_CLOUD=false PATH="C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama;C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama\\rocm;C:\\jdk-21.0.10\\bin;C:\\WINDOWS\\system32;C:\\WINDOWS;C:\\WINDOWS\\System32\\Wbem;C:\\WINDOWS\\System32\\WindowsPowerShell\\v1.0\\;C:\\WINDOWS\\System32\\OpenSSH\\;C:\\Program Files\\Git\\cmd;C:\\Exercism;C:\\Program Files\\dotnet\\;C:\\Program Files\\nodejs\\;C:\\apache-maven-3.9.12\\bin;;C:\\Program Files\\Docker\\Docker\\resources\\bin;C:\\jdk-21.0.10\\bin;C:\\Users\\Lukinhas\\AppData\\Local\\Microsoft\\WindowsApps;C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Microsoft VS Code\\bin;C:\\MinGW\\bin;C:\\Exercism;C:\\Users\\Lukinhas\\AppData\\Roaming\\npm;;C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama" OLLAMA_LIBRARY_PATH=C:\Users\Lukinhas\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\Lukinhas\AppData\Local\Programs\Ollama\lib\ollama\rocm
time=2026-03-25T01:44:58.434-03:00 level=DEBUG source=runner.go:437 msg="bootstrap discovery took" duration=368.4025ms OLLAMA_LIBRARY_PATH="[C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama C:\\Users\\Lukinhas\\AppData\\Local\\Programs\\Ollama\\lib\\ollama\\rocm]" extra_envs=map[]
time=2026-03-25T01:44:58.434-03:00 level=INFO source=runner.go:106 msg="experimental Vulkan support disabled.  To enable, set OLLAMA_VULKAN=1"
time=2026-03-25T01:44:58.435-03:00 level=DEBUG source=runner.go:124 msg="evaluating which, if any, devices to filter out" initial_count=0
time=2026-03-25T01:44:58.435-03:00 level=DEBUG source=runner.go:40 msg="GPU bootstrap discovery took" duration=1.3133643s
time=2026-03-25T01:44:58.435-03:00 level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="15.8 GiB" available="8.0 GiB"
time=2026-03-25T01:44:58.436-03:00 level=INFO source=routes.go:1848 msg="vram-based default context" total_vram="0 B" default_num_ctx=4096
[GIN] 2026/03/25 - 01:45:15 | 200 |            0s |       127.0.0.1 | HEAD     "/"
[GIN] 2026/03/25 - 01:45:15 | 404 |      1.3235ms |       127.0.0.1 | POST     "/api/show"
[GIN] 2026/03/25 - 01:45:15 | 200 |    513.8069ms |       127.0.0.1 | POST     "/api/pull"

OS

Windows

GPU

Intel

CPU

Intel

Ollama version

0.18.2

GiteaMirror added the bug label 2026-04-12 22:33:32 -05:00
Author
Owner

@BruceMacD commented on GitHub (Mar 26, 2026):

Hi @LucasPavs, sorry for the issues. I think you're running into a few bugs here, and I suspect they're caused by manually generating the Ollama key pair.

I'd suggest removing the ollama keys and letting the server regenerate them.

  1. Remove the current Ollama keys:
     rm ~/.ollama/id_ed25519*
  2. Restart the Ollama server by quitting and reopening Ollama from the system tray / Start menu. Key generation happens at server startup, so restarting the server will regenerate the key pair if ~/.ollama/id_ed25519 is missing. Do not generate the Ollama key manually.
  3. You should now be able to pull models and sign in.
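The removal step above can be sketched as a small script (a sketch only, using Git Bash / WSL path conventions; the directory argument is parameterized here purely for illustration — on the reporter's machine the keys live under %USERPROFILE%\.ollama):

```shell
#!/bin/sh
# Sketch of step 1: delete the manually generated key pair so the server
# can regenerate it on the next startup. Directory passed as an argument
# for testing; defaults to ~/.ollama.
reset_ollama_keys() {
  rm -f "$1"/id_ed25519 "$1"/id_ed25519.pub
  echo "keys removed from $1; restart Ollama to regenerate them"
}

reset_ollama_keys "${OLLAMA_DIR:-$HOME/.ollama}"
```

After running this, the restart (step 2) still has to happen through the system tray / Start menu, since key regeneration is tied to server startup.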

Reference: github-starred/ollama#9670