[GH-ISSUE #14010] Cannot access Ollama settings Win10 22H2 & http://localhost:11434/api #9157

Closed
opened 2026-04-12 22:00:25 -05:00 by GiteaMirror · 11 comments

Originally created by @jekv2 on GitHub (Feb 1, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14010

When I click Ollama Settings, nothing happens, and then the popup menu for the tray icon does not pop up at all. The same happens when trying to open the app:
https://github.com/ollama/ollama/issues/14000

The menu
![Image](https://github.com/user-attachments/assets/cbc82fa9-2f36-4142-8ca2-0705a7a67f33)

Also, I cannot access
http://localhost:11434/api
Described here
https://docs.ollama.com/api/introduction

404 page not found
![Image](https://github.com/user-attachments/assets/7356b86b-8be7-46c0-b5e8-a7b1d7358b91)

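As an aside on the 404: to my understanding the Ollama server registers handlers only for concrete routes under `/api` (such as `/api/tags` and `/api/version`), so requesting the bare `/api` path returning "404 page not found" is expected even when the server is healthy. A minimal sketch, assuming the server is listening on the default port 11434:

```shell
# Sketch: check the Ollama API via concrete endpoints rather than the
# bare /api path (which has no route and is expected to return 404).
curl http://localhost:11434/api/version   # reports the server version
curl http://localhost:11434/api/tags      # lists locally installed models
```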

@jekv2 commented on GitHub (Feb 1, 2026):

Do I need javascript installed on win10?


@rick-github commented on GitHub (Feb 1, 2026):

> Also, I cannot access http://localhost:11434/api

What happens if you run `curl localhost:11434` from a terminal session?

> Do I need javascript installed on win10?

Not to run ollama. If you have clients built to use JavaScript, then yes.


@jekv2 commented on GitHub (Feb 1, 2026):

> > Also, I cannot access http://localhost:11434/api
>
> What happens if you run `curl localhost:11434` from a terminal session?
>
> > Do I need javascript installed on win10?
>
> Not to run ollama. If you have clients built to use JavaScript, then yes.

It is running, but it won't open Settings or open Ollama after either is clicked, and after either is clicked the popup menu no longer shows on the taskbar notification icon. I must kill both processes in Task Manager and reopen it, rinse and repeat, although the first click of Open Logs works.

```
C:\Users\Admin>curl localhost:11434
Ollama is running
C:\Users\Admin>
```

![Image](https://github.com/user-attachments/assets/2d66654a-1f63-4803-98bd-a9e8a8615145)

@rick-github commented on GitHub (Feb 1, 2026):

The app.log and server.log files from the [logs directory](https://docs.ollama.com/troubleshooting) may help with debugging.


@jekv2 commented on GitHub (Feb 1, 2026):

> The app.log and server.log files from the [logs directory](https://docs.ollama.com/troubleshooting) may help with debugging.

[app.log](https://github.com/user-attachments/files/24994240/app.log)

[server-5.log](https://github.com/user-attachments/files/24994241/server-5.log)

@jekv2 commented on GitHub (Feb 1, 2026):

I need to access Ollama's settings.

Ollama is utilizing both CPU and GPU. Yesterday, when first installed and using Ollama, my GPU was at 100% and the AI model was really fast; today it reaches at most about 38% GPU usage and then taxes my CPU, with the AI being extremely slow.

![Image](https://github.com/user-attachments/assets/aace24c6-02a5-4451-8a26-9c8ce7e1e4cb)

@rick-github commented on GitHub (Feb 1, 2026):

> Ollama is utilizing both cpu and gpu, Yesterday when first installed and was using ollama, my gpu be 100% and AI model really fast, today it gets maybe max to 38% GPU usage and then taxes my cpu with AI being extremely slow.

https://github.com/ollama/ollama/issues/14002#issuecomment-3831324757


@rick-github commented on GitHub (Feb 1, 2026):

In a CMD terminal session, what's the output of

```
set | findstr OLLAMA
```

@jekv2 commented on GitHub (Feb 1, 2026):

> `set | findstr OLLAMA`

PowerShell run as Admin:

```
PS C:\Windows\system32> set | findstr OLLAMA

cmdlet Set-Variable at command pipeline position 1
Supply values for the following parameters:
Name[0]:
```

The cmd prompt output is a blank line:

```
C:\Users\Admin>set | findstr OLLAMA

C:\Users\Admin>
```
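(The empty cmd output indicates no `OLLAMA_*` environment variables are set in that session. The PowerShell prompt appears because in PowerShell `set` resolves to the `Set-Variable` cmdlet rather than to cmd's built-in. A minimal sketch of equivalent PowerShell checks, for anyone reproducing this:

```powershell
# Sketch: list OLLAMA_* environment variables in PowerShell, where
# cmd's `set` is shadowed by the Set-Variable cmdlet.
Get-ChildItem Env: | Where-Object Name -like 'OLLAMA*'
# or shell out to cmd and filter the text output:
cmd /c set | Select-String 'OLLAMA'
```

Either form prints nothing when no such variables are defined.)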

As for GPU usage: I am learning a bit about Ollama/Open WebUI. I went into admin tools → models, deleted all models, and found out that 4B models are best for my 8 GB 3060 Ti FE. I tested:
gemma3:4b — fast, using 99% GPU
gemma3:12b — slow

Before I downloaded a fresh gemma3:4b, I had llama3.1 8b, which was slow and used both GPU and CPU.

Sorry to take up your time, but is there a UI way to show the stats of the response after the model finishes, like what you asked about?

Thank You.


@jekv2 commented on GitHub (Feb 4, 2026):

I believe I have fixed it by accident; I think the Microsoft Edge WebView2 Runtime is needed for Ollama?

I had to reinstall it for a different app, then I ran into a Windows 10 problem, fired up Ollama and Open WebUI to ask it a question, and when I launched Ollama, the window popped up. Very interesting.

Is WebView2 required to run Ollama on Win10 22H2?

![Image](https://github.com/user-attachments/assets/52572f25-982f-40ca-8a3f-7d29186071be)

@rick-github commented on GitHub (Feb 4, 2026):

> Is webview2 required to run Ollama on win10 22H2?

It looks like it, a [similar issue](https://github.com/ollama/ollama/issues/12050#issuecomment-3262353425) was resolved by installing webview2.

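(For anyone hitting this: WebView2 presence can be checked from the command line. A sketch, assuming Microsoft's documented per-machine detection key for the WebView2 Runtime on 64-bit Windows — the GUID below is the runtime's client ID and the `pv` value holds its version:

```bat
:: Sketch: detect an installed WebView2 Runtime on 64-bit Windows.
:: An error here suggests the runtime is not installed machine-wide.
reg query "HKLM\SOFTWARE\WOW6432Node\Microsoft\EdgeUpdate\Clients\{F3017226-FE2A-4295-8BDF-00C3A9A7E4C5}" /v pv
```

If this key is missing, installing the WebView2 Runtime, as above, is the likely fix.)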
Reference: github-starred/ollama#9157