[GH-ISSUE #12165] Mac OS app is 100% useless #8089

Open
opened 2026-04-12 20:23:06 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @khorn-infomedia on GitHub (Sep 3, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12165

Is the Mac OS Ollama GUI a practical joke?

I have downloaded the Ollama Mac OS app, after using the CLI version for a long while. The app sat there with three dots blinking for 2 days ? WTF.
I asked it what model you are. It says Chat-GPT-4. Nothing else telsl me what model is loaded.
I can find no explanation of what the UI is and what the buttons and drop downs do ?

I selected a model from the drop-down, and it spent a day downloading it, but then it will not run the model. I have no idea if it downloaded or not. There is no icon saying whether it's there. How about using AI to tell me WTF the app is doing? What does the up arrow button mean?

When I select a model like qwen3:8b the app does nothing. No message, no feedback.
Do you have to sign in to some secret account to make the app function? A user interface failure.
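For the record, the only way I found to verify any of this was the bundled CLI. A sketch, assuming the `ollama` binary the app installs is on your PATH:

```shell
# Check what the app actually did, since the GUI gives no feedback.
if command -v ollama >/dev/null 2>&1; then
  ollama list   # models actually downloaded to disk
  ollama ps     # model currently loaded in memory (if any)
else
  echo "ollama CLI not found on PATH"
fi
```

`ollama ps` also answers the "what model are you" question reliably, which the model itself cannot.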

While it was downloading, the app was useless. It would not accept prompts!
How about downloading in the background?
This has to be the worst application I have ever seen. Deleting it. Ollama was great; now it's useless.

What is the interface for? There is no explanation of what the drop-down model list is or what the models listed there are; it does not align with the models on the Ollama web site.

I give up. It is completely useless and unusable. Someone is taking too many drugs. Is this a practical joke?

I am wondering if the Ollama team can write a manual for how they think their app should function, because it's a mystery to me. Maybe it was written by AI, and the Ollama team was not informed what the app does?

Yes, the logs. Sorry, but if you write an application that requires the user to mine logs, then the GUI is a total failure. At least add hover text to describe what the random buttons do.

By chance I found that I can go back to a CLI version of Ollama. I wonder if that works, or is now a total mess?

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

0.11.8

GiteaMirror added the "app", "needs more info", "bug" labels 2026-04-12 20:23:07 -05:00
Author
Owner

@ghmer commented on GitHub (Sep 3, 2025):

The app is delivered on top of the CLI; no clue why you needed to "go back".
I am on macOS, and even though I am using BoltAI as my main frontend, I haven't seen issues while using the app.

If there is anything missing right now, it is surely a comprehensive settings option. But otherwise, the app is a fine alternative to using the CLI, at least for the ordinary user.

Author
Owner

@pdevine commented on GitHub (Sep 3, 2025):

@khorn-infomedia what type of MBP are you using, and which model was it trying to download? What I think happened is that you may have downloaded a model which was too large for your system. There definitely needs to be better feedback.

Author
Owner

@khorn-infomedia commented on GitHub (Sep 4, 2025):

> The app is delivered on top of the CLI; no clue why you needed to "go back". I am on macOS, and even though I am using BoltAI as my main frontend, I haven't seen issues while using the app.
>
> If there is anything missing right now, it is surely a comprehensive settings option. But otherwise, the app is a fine alternative to using the CLI, at least for the ordinary user.

So where is the documentation or description stating the GUI is on top of a CLI? How would I know that? I was not born with that information, and there were no magic fairies telling me. Where is the instruction manual? I assumed the GUI app was a total replacement for the prior CLI I was using; given no information, instructions, or manual, why would I assume otherwise? The Ollama web page is so bare it is dysfunctional. It may look trendy to have no words on the web page, but that's pretty useless for humans that use 'language'. You cannot assume users know your product 100%.

Looking around I saw comments about an alternate download for just a CLI, and about the old CLI from brew being out of date but still available. However, there is no description of what the current stable product is and how it works. It's a confusing mess. Discord is a dysfunctional joke.

I eventually worked out that the 'thingy' had a similar old CLI behind the scenes, and so I have given up on the GUI. The GUI is a waste of space. Having users run experiments to work out what an app does, discovering features and what buttons are for by accident, is just silly. It is really poor UX.

The issue was not due to anything being too large for my system; my system is massive: 48 GB RAM, M4 Pro, 2 TB disk. The issue is that the Internet is unreliable and the app has bugs that stop it recommencing downloads. A typical distributed-systems bug: "the network is unreliable". Looking at the logs, the app was permanently attempting and failing to connect after downloading half the model, even though the endpoint was available when I tried to connect manually. Some weird app bug. After killing the app and starting it again, it worked out how to recommence the download. No idea what mess the app got itself into. BUT how about providing some message to the USER? Just dying (or looping forever) with no message for the user is very bad design. Apps must provide feedback!
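What finally gave me visibility was dropping to the CLI. A sketch of the workaround, assuming the bundled `ollama` binary is on PATH; the model name is just the one I happened to be testing, and the log path is the documented macOS location:

```shell
# Resume the stalled download from the CLI instead of the GUI;
# ollama pull picks up a partially downloaded model.
if command -v ollama >/dev/null 2>&1; then
  ollama pull qwen3:8b
fi

# Watch what the app is actually doing; on macOS the server log lives here.
LOG="$HOME/.ollama/logs/server.log"
[ -f "$LOG" ] && tail -n 50 "$LOG" || echo "no server log at $LOG"
```

The connection errors the GUI silently swallows show up plainly in that log.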

I also had to delete my old .ollama directory from the old CLI, delete the new install, and reinstall, and this improved functionality. Old models seem to kill the app? How about providing some words (language) on upgrading.
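Written out, the reset that worked for me looks roughly like this. It is destructive (it deletes every downloaded model), and the app path is an assumption for a standard install, so the sketch dry-runs by default:

```shell
# Full reset: quit the app, remove old models/state, reinstall.
# Dry-run by default; set DO_RESET=1 to actually delete everything.
run() { if [ "${DO_RESET:-0}" = "1" ]; then "$@"; else echo "would run: $*"; fi; }

run osascript -e 'quit app "Ollama"'
run rm -rf "$HOME/.ollama"              # old CLI-era models and state
run rm -rf /Applications/Ollama.app     # default install path (assumption)
echo "now reinstall from the ollama.com download"
```

A fresh ~/.ollama is what seemed to stop the old models from killing the app.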

If the GUI is meant for beginners as an alternative to the CLI, it fails 100%. Having to search logs is not an alternative GUI.

We are using LLMs to speak to us in our language. It does not follow that you then remove all words and language from the app, your web site, product manuals, and instructions :-)

Author
Owner

@ghmer commented on GitHub (Sep 4, 2025):

> So where is the documentation or description stating the GUI is on top of a CLI?

https://github.com/ollama/ollama/blob/main/docs/macos.md

Author
Owner

@khorn-infomedia commented on GitHub (Sep 5, 2025):

I think this just proves the point. Users download the installer from the main web page, run it, and the app is 'functioning'. Why would they go to an obscure page like this, buried levels down in Git, that applies to a macOS install only and is not a general user manual?

Anyhow, if you want to make users' lives harder, then I agree with you.

The point is that when you develop something, how it works, what it does, and where the obscure user text lives are all obvious to you, but not to anyone else.

Reference: github-starred/ollama#8089