[GH-ISSUE #12007] Windows GUI: "No model found" with a "runtime error: invalid memory address or nil pointer dereference" when loading models (v0.11.4) #33733

Open
opened 2026-04-22 16:41:35 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @mastrupikenzu on GitHub (Aug 21, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12007

Originally assigned to: @jmorganca on GitHub.

What is the issue?

I'm encountering an issue with the official Ollama GUI on Windows 10 where the dropdown menu for models shows "No model found," even though I have three models installed and they are listed correctly when I run ollama list from the command line.

The server is running correctly and responds to http://localhost:11434.
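Since the server itself answers on port 11434, the model list the GUI fails to show can be fetched directly from the local API (for example with `curl http://localhost:11434/api/tags`). A minimal Go sketch of parsing that endpoint's JSON shape, assuming the standard `/api/tags` list-models response; the sample payload here is hypothetical so the snippet runs offline:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// tagsResponse mirrors the JSON shape returned by Ollama's
// GET http://localhost:11434/api/tags (the same data `ollama list` shows).
type tagsResponse struct {
	Models []struct {
		Name string `json:"name"`
	} `json:"models"`
}

// parseModelNames extracts just the model names from a /api/tags payload.
func parseModelNames(body []byte) ([]string, error) {
	var resp tagsResponse
	if err := json.Unmarshal(body, &resp); err != nil {
		return nil, err
	}
	names := make([]string, 0, len(resp.Models))
	for _, m := range resp.Models {
		names = append(names, m.Name)
	}
	return names, nil
}

func main() {
	// Hypothetical sample payload standing in for a live response;
	// in practice the body would come from an http.Get to the local server.
	sample := []byte(`{"models":[{"name":"llama3:latest"},{"name":"mistral:latest"},{"name":"phi3:latest"}]}`)
	names, err := parseModelNames(sample)
	if err != nil {
		panic(err)
	}
	fmt.Println(names) // [llama3:latest mistral:latest phi3:latest]
}
```

If this endpoint returns the installed models (as `ollama list` suggests it does), the failure is confined to the GUI's own `/api/v1/models` path rather than the server's model store.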

I performed a clean reinstallation, including manually deleting the .ollama folders from my user directory, but the problem persists.

My computer is behind a corporate firewall, which may be related to some of the PROTOCOL_ERROR messages visible in the log, but this does not explain the core panic error when the GUI attempts to load the models.
(The Italian OS error in the log below, "Impossibile trovare il percorso specificato", is Windows' "The system cannot find the path specified.")

I believe this is a bug in the GUI itself, as confirmed by the log, which shows a critical runtime error.

Expected Behavior
The dropdown menu should display the list of installed models.

Current Behavior
The dropdown menu is empty, and the message "No model found" is displayed. The log file shows a panic error when the GUI attempts to load the models.

The only way to use the GUI is to enable Airplane mode.

Relevant log output

time=2025-08-20T16:26:08.167+02:00 level=INFO source=app.go:229 msg="starting ollama server"
time=2025-08-20T16:26:08.167+02:00 level=INFO source=auth.go:69 msg="Failed to load private key: open C:\\Users\\<user>\\.ollama\\id_ed25519: Impossibile trovare il percorso specificato."
time=2025-08-20T16:26:08.167+02:00 level=WARN source=app.go:253 msg="failed to load user data" error="failed to call ollama.com/api/me: failed to sign request: open C:\\Users\\<user>\\.ollama\\id_ed25519: Impossibile trovare il percorso specificato."
...
time=2025-08-20T16:26:10.010+02:00 level=ERROR source=ui.go:166 msg=site.serveHTTP panic="runtime error: invalid memory address or nil pointer dereference" request_id=1755699969893407700 http.method=GET http.path=/api/v1/models http.pattern="GET /api/v1/models" http.status=500 http.d=117.1488ms request_id=1755699969893407700 version=0.11.4
time=2025-08-20T16:26:10.010+02:00 level=INFO source=.:0 msg="http: panic serving 127.0.0.1:53594: runtime error: invalid memory address or nil pointer dereference\ngoroutine 50 [running]:
... (stack trace continues)
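The sequence in the log, "failed to load user data" followed by a panic on GET /api/v1/models, is consistent with a handler dereferencing state that was never initialized because the earlier key-signing/network call failed. A minimal, hypothetical Go sketch of that failure mode and the nil guard that would avoid it (the names `userData`, `loadUserData`, and `listModelsSafe` are illustrative, not Ollama's actual code):

```go
package main

import (
	"errors"
	"fmt"
)

type userData struct {
	Models []string
}

// loadUserData stands in for the call to ollama.com/api/me that fails
// behind the firewall: it returns a nil pointer along with an error.
func loadUserData() (*userData, error) {
	return nil, errors.New("failed to sign request: missing id_ed25519")
}

// listModelsUnsafe reproduces the crash pattern seen in the log: the
// load error was only logged, and the nil pointer is dereferenced anyway.
func listModelsUnsafe(u *userData) []string {
	return u.Models // panics: invalid memory address or nil pointer dereference
}

// listModelsSafe guards the nil pointer and degrades gracefully instead.
func listModelsSafe(u *userData) ([]string, error) {
	if u == nil {
		return nil, errors.New("user data unavailable")
	}
	return u.Models, nil
}

func main() {
	u, err := loadUserData()
	if err != nil {
		fmt.Println("warn:", err)
	}
	if _, err := listModelsSafe(u); err != nil {
		// A guarded handler could fall back to the local model list here
		// instead of returning HTTP 500.
		fmt.Println("degraded:", err)
	}
}
```

This would also explain why Airplane mode works around the problem: with the remote call skipped entirely, the handler never reaches the uninitialized state.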

OS

Windows

GPU

Nvidia

CPU

AMD

Ollama version

0.11.4

GiteaMirror added the app, bug labels 2026-04-22 16:41:35 -05:00

@grester commented on GitHub (Aug 29, 2025):

I also have this problem on Windows Server 2019. I'll try to get my logs as well.
Side note: these may be a chain of related incidents, if a lack of permission to read some kind of data is the culprit: #12050 #12112


Reference: github-starred/ollama#33733