[GH-ISSUE #13923] Ollama Stuck on "Loading..." in Mac App #9110

Open
opened 2026-04-12 21:57:46 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @d-shehu on GitHub (Jan 27, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/13923

What is the issue?

Recently, I've noticed Ollama on my Mac getting stuck on "Loading ..." and becoming unresponsive. I completely uninstalled it and removed all ollama folders in my home directory, as well as all other folders named ollama on my system.

I rebooted and installed v0.15.2. Everything worked fine until I reloaded it again this morning, and the problem has resurfaced. CLI commands such as `ollama list` or `ollama run ...` are likewise unresponsive.

However, if I run a query over the API or use it with OpenWebUI, it seems to work fine. This is a brand-new issue I've never seen before. There are no other issues on this system, and there is plenty of free memory, etc.

Mac OS 26.1 / M3 Pro / 18GB of RAM.

Image

Relevant log output


OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.15.2

GiteaMirror added the bug label 2026-04-12 21:57:47 -05:00

@ricruan commented on GitHub (Jan 27, 2026):

Is it caused by the version?


@albizrik commented on GitHub (Jan 28, 2026):

post edited while investigating

OK, I fixed it: I had an environment variable OLLAMA_HOST=192.168.x.x (a fixed IP I *used* to assign to that computer ...)
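For anyone hitting the same thing, a stale `OLLAMA_HOST` can be checked from a shell first. This is a diagnostic sketch, not an official Ollama procedure, and the function name here is made up:

```shell
#!/bin/sh
# check_ollama_host: report whether OLLAMA_HOST is set, since a stale
# value (e.g. an IP no longer assigned to this machine) can leave the
# Mac app hanging while the server itself still answers API requests.
check_ollama_host() {
    if [ -n "${OLLAMA_HOST-}" ]; then
        echo "OLLAMA_HOST=$OLLAMA_HOST (verify this address is still reachable)"
    else
        echo "OLLAMA_HOST unset (Ollama defaults to 127.0.0.1:11434)"
    fi
}

check_ollama_host
```

If it points at a dead address, `unset OLLAMA_HOST` clears it for the current shell, and `launchctl unsetenv OLLAMA_HOST` clears it for GUI apps on macOS.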


@d-shehu commented on GitHub (Jan 28, 2026):

Both the previous and the new version produced the same result. I had previously installed it via Brew and never saw this issue. I updated via the Ollama desktop menu bar and I saw the issue immediately.

I purged Ollama and did a clean install using DMG from Ollama's site. Same issue. But the web server handled requests just fine to OpenWebUI.

Today, I restarted my Mac and played around with environment variables. I couldn't get Ollama to pick up values set via launchctl setenv, so I defaulted to just passing the env variables on the command line (OLLAMA_KEEP_ALIVE=..., etc.) and ran `ollama serve`.

When I opened up the frontend, it was now working. I know Ollama was bound to 0.0.0.0 correctly because it was accessible over the network. However, I do wonder if the issue is related somehow to env variables.

I'd like to keep this open, as it's very strange for the UI to freeze up indefinitely with a default install, especially since it had been working through Brew for 1+ year and I'd updated it regularly via the desktop menu without any issues.
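The inline-environment workaround described above amounts to the following. It is shown here with `env` so the effect is visible without starting a server; the values are placeholders, and in practice the trailing command would be `ollama serve`:

```shell
#!/bin/sh
# Pass overrides inline instead of via launchctl setenv; they apply only
# to this one command. Values here are placeholders, not recommendations.
OLLAMA_HOST=0.0.0.0:11434 OLLAMA_KEEP_ALIVE=5m env | grep '^OLLAMA_'
```

Inline assignments like this sidestep launchctl entirely, which may explain why the server behaved once started this way.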


@bkowens commented on GitHub (Feb 27, 2026):

I have been running into this problem as well; the last six versions have it. I removed every reference to Ollama, then tried to reinstall the last version that worked. STILL stuck at loading. Mind you, the ollama command line works fine and shows and loads models; it's the Ollama GUI app that's broken.


@Nicolas-Delahaie commented on GitHub (Mar 6, 2026):

I had the exact same issue: models keep unloading in the Ollama UI and commands just spin endlessly.
This fixed it for me (on macOS):

  1. First, check the server logs:

     ```shell
     tail -f ~/.ollama/logs/server.log
     ```

  2. If you see “port 11434 already in use” or bind errors, find what is holding the port:

     ```shell
     lsof -i :11434
     ```

  3. Note the PID(s), then kill them:

     ```shell
     kill -9 <PID>
     ```

  4. Restart the server:

     ```shell
     ollama serve
     ```
Reference: github-starred/ollama#9110