[GH-ISSUE #11609] Cross Platform GUI App #33427

Open
opened 2026-04-22 16:04:47 -05:00 by GiteaMirror · 11 comments
Owner

Originally created by @trymeouteh on GitHub (Jul 31, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11609

Please create a GUI app for these platforms so users can easily download and manage LLMs on any platform without having to use a CLI.

  • ~~Windows~~
  • ~~MacOS~~
  • Linux
  • Android
  • iOS

I would suggest using Flutter or Tauri which allows you to create apps on desktop, Android and iOS with one codebase, making it easier to manage.

GiteaMirror added the feature request label 2026-04-22 16:04:47 -05:00
Author
Owner

@FringeNet commented on GitHub (Jul 31, 2025):

[Ollama App](https://github.com/JHubi1/ollama-app)
[Swift Chat](https://github.com/aws-samples/swift-chat)
[Ollama Server](https://github.com/sunshine0523/OllamaServer)

Typical use of ollama involves building your own software around the REST API, which leaves the burden on the developer using Ollama to handle pulling and deleting models.
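As a rough illustration of that burden, here is a minimal sketch of the model-management plumbing each of those apps has to reimplement, using Ollama's documented REST API endpoints (`/api/pull`, `/api/delete`, `/api/tags`) and assuming a default local server on port 11434:

```python
# Minimal sketch of the model management a GUI would wrap, using
# Ollama's documented REST API (default server on localhost:11434).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"

def pull_payload(name: str) -> dict:
    """Request body for /api/pull; stream=False waits for completion."""
    return {"model": name, "stream": False}

def pull_model(name: str) -> str:
    """Ask the server to download a model; returns the final status."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/pull",
        data=json.dumps(pull_payload(name)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("status", "")

def delete_model(name: str) -> None:
    """Remove a locally stored model."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/delete",
        data=json.dumps({"model": name}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="DELETE",
    )
    urllib.request.urlopen(req).close()

def list_models() -> list:
    """Names of the models currently stored locally."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return [m["name"] for m in json.load(resp)["models"]]
```

Each third-party app linked above essentially wraps these same three calls plus a chat view, which is the duplication of effort being pointed at here.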


@seboss666 commented on GitHub (Aug 3, 2025):

And yet they built their own Windows and macOS app, so at least on desktop I don't see why Linux should not be considered.

It's a different matter on mobile, where, yes, more advanced apps are already usable for connecting to remote Ollama instances; I've even tested some of them running local models (on a Fairphone 4 it's a pretty rough experience).


@FringeNet commented on GitHub (Aug 3, 2025):

> I don't see why Linux should not be considered.

As a Linux user, I don't see a point in having a GUI for it.


@seboss666 commented on GitHub (Aug 3, 2025):

> > I don't see why Linux should not be considered.
>
> As a Linux user, I don't see a point in having a GUI for it.

You have PowerShell on Windows and Terminal on macOS, so yes, no need for a GUI on those either. And yet...


@trymeouteh commented on GitHub (Aug 3, 2025):

More devices are getting NPUs, including mobile devices, which means smaller LLMs could run on phones in the near future. Being able to install and use LLMs with Ollama on a phone would be great, and a GUI app will be needed to manage the LLMs on a mobile device.

I am glad they made a Windows and macOS GUI app, but I would recommend they make a cross-platform app to support as many devices as possible.


@linuxkernel94 commented on GitHub (Aug 6, 2025):

Please add support for Linux. Many companies benefit from this OS and the open-source model, but that doesn't materialize when it comes to supporting Linux users.


@Mephistofex commented on GitHub (Aug 8, 2025):

I am using Ollama with Open WebUI on Linux for comfortably testing out new models, but I would prefer an Ollama GUI app as an all-in-one solution.

Also, I was at first confused by the Ollama download page about whether the GUI app is available for Linux: [Website Download Page is Confusing for Linux Users #11668](https://github.com/ollama/ollama/issues/11668).


@landsman commented on GitHub (Oct 30, 2025):

Kotlin Multiplatform would be nice for this use-case.


@wpostma commented on GitHub (Mar 28, 2026):

I have a working Linux version I could PR:
the `linux_ui_feature` branch on https://github.com/wpostma/ollama

Engineering by me, heavy lifting by Claude Code/Opus 4.6.


@lizzi193 commented on GitHub (Apr 10, 2026):

I also think it is very strange that a simple-to-install, intuitively usable GUI is provided exclusively for Windows and macOS, while Linux end users without technical skills are completely excluded.

Unfortunately, the persistent myth that Linux desktop operating systems are only used by developers and tech enthusiasts still seems to be baked into the brains of people who develop applications.

But surely, in 2026 this old myth is no longer valid.

Since the revelation of End-of-10 in 2021, and at the very latest with the start of 2025, more and more end users have switched from Windows to various Linux distributions. Today, Linux is used by many pure end users who simply cannot be expected to work with the CLI or the complicated setup of Open WebUI (Docker).

I consider this feature request not a matter of luxury debate but rather a question of self-evident accessibility for people with little to no technical skills. Every person should have the opportunity to use Ollama (and any other application) intuitively out-of-the-box, not just those who use Windows or macOS.

Please, dear official developers, and please, dear Linux tech enthusiasts: no one will be deprived of a single line of CLI if, on the other hand, it is made possible for everyone to use an application in the simplest and most intuitive way possible.

Please.


@wpostma commented on GitHub (Apr 15, 2026):

> Please.

You can build my version (see link above).


Reference: github-starred/ollama#33427