[GH-ISSUE #470] feat: package as executable (desktop app) #12076

Closed
opened 2026-04-19 18:51:02 -05:00 by GiteaMirror · 38 comments
Owner

Originally created by @tjbck on GitHub (Jan 13, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/470

Ideally we would want something like Ollama

Maybe we could use https://pyinstaller.org/en/stable/
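As a rough idea of what a PyInstaller build could look like, here is a hypothetical sketch that writes out a build helper script. The entry point (`backend/main.py`), flags, and output name are assumptions about the project layout, and the script is only written to disk here, not executed:

```shell
# Hypothetical sketch: write out a PyInstaller build script.
# Entry point, flags, and names are assumptions, not the project's
# actual build; the script is written here, not run.
cat > build_exe.sh <<'EOF'
#!/bin/sh
set -e
pip install pyinstaller
# --onefile produces a single self-contained binary;
# --add-data bundles the pre-built frontend alongside the backend.
pyinstaller --onefile --name open-webui \
    --add-data "build:frontend" \
    backend/main.py
EOF
chmod +x build_exe.sh
```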

GiteaMirror added the enhancement, good first issue, help wanted, core labels 2026-04-19 18:51:02 -05:00

@justinh-rahb commented on GitHub (Jan 14, 2024):

More options for deployment are always a great idea. Packaging on macOS can be a bit of a chore: you'll need a $99/yr developer account to create signed/notarized binaries. Not sure if you need to pay for signing Windows binaries or not, but it's at least easier for the user to bypass if you don't.


@oliverbob commented on GitHub (Jan 19, 2024):

> More options for deployment are always a great idea. Packaging on macOS can be a bit of a chore, you'll need a $99/yr developer account to create signed/notarized binaries. Not sure if you need to pay for signing Windows binaries or not, but it's at least easier to bypass for the user if you don't.

While it seems impractical to do this for macOS, that isn't the case for Windows, Linux, Android, etc. But I believe there is a way to package for multiple platforms using Flutter.

Though that would perhaps mean creating a new sibling Flutter project. It lets you create apps for desktop and, according to the Flutter docs, for iOS as well. I tried it in the past but didn't have a Mac; it works on all platforms though. The Snap Store also distributes executables; I've tried it on some Apple devices and it worked.


@justinh-rahb commented on GitHub (Feb 15, 2024):

On the packaging front, I am actively looking into Flatpak:

**Manifest:**

```yaml
app-id: org.ollama-webui.Ollama-WebUI
runtime: org.freedesktop.Platform
runtime-version: '22.08'
sdk: org.freedesktop.Sdk
command: start-webui.sh
finish-args:
  - --share=ipc
  - --socket=x11
  - --socket=wayland
  - --share=network
  - --filesystem=home
  - --device=dri
  - --env=ENV=prod
  - --env=SCARF_NO_ANALYTICS=true
  - --env=DO_NOT_TRACK=true
modules:
  - name: nodejs
    buildsystem: simple
    build-commands:
      - npm install
      - npm run build
      - mkdir -p /app/frontend
      - cp -r build/* /app/frontend/
    sources:
      - type: git
        url: https://github.com/ollama-webui/ollama-webui
        tag: main  # Specify the correct branch or tag here
      - type: archive
        url: https://chroma-onnx-models.s3.amazonaws.com/all-MiniLM-L6-v2/onnx.tar.gz
        dest: /app/data/onnx_models
  - name: python-backend
    buildsystem: simple
    build-commands:
      - pip3 install --no-cache-dir torch torchvision torchaudio -f https://download.pytorch.org/whl/cpu
      - pip3 install --no-cache-dir -r backend/requirements.txt
      - install -D backend/start.sh /app/bin/start-webui.sh
      - cp -r backend/* /app/backend/
      - cp -a /app/data/onnx_models /root/.cache/chroma/onnx_models
    sources:
      - type: git
        url: https://github.com/ollama-webui/ollama-webui
        tag: main  # Specify the correct branch or tag here
    post-install:
      - mkdir -p /app/bin
      - echo -e '#!/bin/sh\nexec /app/backend/start.sh' > /app/bin/start-webui.sh
      - chmod +x /app/bin/start-webui.sh
```

**Workflow:**

```yaml
name: Build and Release Flatpak

on:
  push:
    tags:
      - 'v*'

permissions:
  contents: write
  id-token: write

jobs:
  flatpak-build-and-release:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Install Flatpak and Flatpak Builder
        run: |
          sudo apt-get update -y
          sudo apt-get install -y flatpak flatpak-builder

      - name: Add Flathub repository
        run: flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo

      - name: Build Flatpak application
        run: flatpak-builder --force-clean build-dir org.ollama.WebUI.yaml

      - name: Create bundle
        run: flatpak build-bundle build-dir ollama-webui.flatpak org.ollama.WebUI

      - name: Create Release and Upload Asset
        uses: softprops/action-gh-release@v1
        with:
          name: ${{ github.ref_name }}
          body: |
            ## Release ${{ github.ref_name }} of Ollama WebUI
            
            ### 🚀 New Features
            - List new features here
            - Improvements or bug fixes
      
            ### 📦 Installation Instructions
            To install Ollama WebUI on your system, follow these steps:
            1. Ensure Flatpak is installed on your system.
            2. Download `ollama-webui.flatpak`.
            3. Install the application using `flatpak install ollama-webui.flatpak`.
      
          tag_name: ${{ github.ref_name }}
          files: |
            ollama-webui.flatpak
          token: ${{ secrets.GITHUB_TOKEN }}
          draft: true # Adjust these as preferred
          prerelease: true # Adjust these as preferred
```

I'll wait until after the rename to start the PR.


@tjbck commented on GitHub (Feb 16, 2024):

We should also have something like these from oobabooga/text-generation-webui, so that users can install without Docker with one command as well and streamline the installation process:

https://github.com/oobabooga/text-generation-webui/blob/main/start_linux.sh
https://github.com/oobabooga/text-generation-webui/blob/main/start_macos.sh
https://github.com/oobabooga/text-generation-webui/blob/main/start_windows.bat
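A launcher in that style boils down to: create a local venv on first run, install into it, then exec the server. A minimal hypothetical sketch (the PyPI package name and paths are assumptions; the script is written to disk here rather than executed):

```shell
# Hypothetical sketch of a one-command Linux launcher modeled on
# oobabooga's start_linux.sh; package name and paths are assumptions.
cat > start_linux.sh <<'EOF'
#!/bin/sh
set -e
VENV_DIR="$(dirname "$0")/venv"
# First run: create an isolated venv and install into it.
if [ ! -d "$VENV_DIR" ]; then
    python3 -m venv "$VENV_DIR"
    "$VENV_DIR/bin/pip" install open-webui
fi
# Subsequent runs: just launch the server from the venv.
exec "$VENV_DIR/bin/open-webui" serve
EOF
chmod +x start_linux.sh
```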


@justinh-rahb commented on GitHub (Feb 16, 2024):

> We should also have something like these from oobabooga/text-generation-webui, so that users can install without docker with one command as well and streamline the installation process:

I had thought of doing exactly this; I've got macOS and Linux mostly hammered out already for my own purposes. Since we're on the same page I'll get those polished up and ready for a PR too 👍


@tjbck commented on GitHub (Feb 22, 2024):

We might also want to look into providing an installation option using `pip install`.


@dz0ny commented on GitHub (Feb 25, 2024):

One can use pipx; it's far better than pip, which often interferes with whatever else you have installed on the device.


@justinh-rahb commented on GitHub (Feb 25, 2024):

@dz0ny another one we attempted before, and might try again once they've worked out some bugs, is uv (#758).


@cocktailpeanut commented on GitHub (Feb 29, 2024):

Hey guys, I created one: https://x.com/cocktailpeanut/status/1763254738177462672

Basically I work on a project called Pinokio, which is sort of like a browser for automating anything on your computer; it can be used for installing, running, and managing AI apps in native form (no need to mess with terminal stuff).

I became a fan of this project lately and have been using it daily, so I decided to write a 1-click launcher script for it. Hope you enjoy.


@justinh-rahb commented on GitHub (Feb 29, 2024):

That's awesome @cocktailpeanut, glad to see you've kept busy. Thanks for the shoutout! 🫶


@J-eremy commented on GitHub (Mar 11, 2024):

There are big issues with Windows and pretty much all Python packagers, especially single-file packages. Windows Defender flags everything that isn't signed (see: extortion). If Microsoft could get away with not allowing you to install anything third-party they haven't approved, they definitely would. See S Mode.


@nightboysfm commented on GitHub (Mar 23, 2024):

> We should also have something like these from oobabooga/text-generation-webui, so that users can install without docker with one command as well and streamline the installation process:
>
> https://github.com/oobabooga/text-generation-webui/blob/main/start_linux.sh https://github.com/oobabooga/text-generation-webui/blob/main/start_macos.sh https://github.com/oobabooga/text-generation-webui/blob/main/start_windows.bat

Here it is, a one-click installation.
Create open-webui.bat, copy-paste the content below, and put it in a folder (no spaces in the path!), then launch it and let the magic happen. It will download and put all the dependencies in a subfolder and create a Python venv to keep things clean. Git is the only thing installed on the system.
I think some paths are hard-coded in config files, so if you move the folder you might need to fix errors or just reinstall.
There are still improvements to make, but it works fine like this.

Content of the "open-webui.bat": https://pastebin.com/527wvn0k

If you want to enter the existing venv and make changes you can make a "cmd_venv.bat" (or a name you like) and start it: https://pastebin.com/wNArfua2

I hope it will help.


@tjbck commented on GitHub (Mar 23, 2024):

@nightboysfm Looks promising, feel free to make a PR!


@muhanstudio commented on GitHub (May 28, 2024):

This will be an excellent milestone. If you can deliver a desktop app, or even an Android app, you can change the situation for LLM applications. We know that most of the excellent open-source models lack good clients, but maybe this project can bring hope and quickly put the open-source models in the hands of anyone who can click an EXE or APK file.


@motin commented on GitHub (Jun 7, 2024):

How about forking Ollama, re-using all the existing packaging already in place for macOS, Windows, and Linux, and also bundling Open WebUI next to Ollama?


@justinh-rahb commented on GitHub (Jun 7, 2024):

> How about forking Ollama, re-using all that existing packaging already in place for OSX, Windows, Linux and also bundle open webui next to ollama?

Packaging isn't the hard part; doing it properly with certificate signing for the various platforms is the hairy part everyone wishes to avoid.


@motin commented on GitHub (Jun 9, 2024):

> Packaging isn't the hard part; doing it properly with certificate signing for the various platforms is the hairy part everyone wishes to avoid.

I'd include certificate signing in the scope of packaging. Ollama has it figured out technically and it is obviously working (https://github.com/ollama/ollama/blob/main/.github/workflows/release.yaml), and the overlap between users installing the Ollama desktop app and Open WebUI must be large. I agree that organizationally there are other challenges in setting up and maintaining the various developer programs/registrations; if that is what you mean by the hairy part, I agree. But one route is to officially ship with the Ollama desktop app, using Ollama's certificates.


@justinh-rahb commented on GitHub (Jun 10, 2024):

> I'd include certificate signing in the scope of packaging. Ollama has it figured out technically and it is obviously working (https://github.com/ollama/ollama/blob/main/.github/workflows/release.yaml) and the overlap of users installing Ollama desktop app and Open WebUI must be large. Agree that organizationally there are other challenges in how to set up and maintain the various developer programs/registrations, if that is what you mean as the hairy part I agree, but one route is to officially be shipped with the Ollama desktop app, using Ollama's certificates.

While I'd love nothing more than the ultimate Ollama x WebUI collab, with them including us in their installer packages, I highly doubt they'd go for it, and I'd probably understand their reasons why.

Indeed you are correct: the main sticking point is managing the various developer accounts and credentials required by the platform owners, which are also not without cost.


@motin commented on GitHub (Jun 10, 2024):

How about a bring-your-own-credentials model, where it is made easy to package and sign executables / desktop apps, but by default they are not signed (Ollama does this by only signing if SIGN=1 env var is set when running the bundling scripts). This way, anyone that needs signed artifacts (e.g. for distribution within their org) can do it themselves by following simple instructions.
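That conditional could look something like this in a bundling script. This is a hypothetical sketch, not Ollama's actual script: `SIGNING_IDENTITY` and the `codesign` flags are placeholders supplied by whoever runs the build.

```shell
# Hypothetical sketch of opt-in signing: sign only when SIGN=1.
# $SIGNING_IDENTITY and the codesign arguments are placeholders.
maybe_sign() {
    app_path="$1"
    if [ "${SIGN:-0}" = "1" ]; then
        # Caller supplies their own identity and credentials.
        codesign --force --sign "$SIGNING_IDENTITY" "$app_path"
    else
        echo "SIGN not set; leaving $app_path unsigned"
    fi
}
```

Anyone distributing internally would export `SIGN=1` and their identity before running the bundler; everyone else gets an unsigned artifact by default.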


@snakeying commented on GitHub (Jun 17, 2024):

As I mentioned before, this is undoubtedly the best open-source UI I have ever experienced on GitHub. However, for most people new to Docker, the installation process can be a nightmare. Even with detailed instructions, they still find the CLI too technical. Over the past few days, I recommended open-webui to several friends, but they all complained about the lack of a straightforward installation method. They preferred AnythingLLM because it only requires downloading and clicking to install, without the need for WSL, Docker, or similar tools. This proves that a simple installation process is a crucial aspect of the user experience.

I suggest we offer installation prompts similar to AnythingLLM, such as those found here: https://docs.useanything.com/installation/desktop/windows. We could add the following to our installation guide:

```
Application is not signed!
➤ The Open-webui Windows application is currently unsigned and Windows Defender or other anti-virus software might flag the application as malicious.
➤ If you do not want to bypass that alert for any reason, please use open-webui in another way.
```

I understand this is not a perfect solution, but it at least provides more options for users and helps improve their overall experience. As @muhanstudio said, "quickly bring the open-source model to everyone who only clicks the EXE file or the APK file." Isn't user-friendliness one of the most important features of open-webui?

I sincerely hope this project continues to thrive and bring a better experience to more people. Once again, thank you to the development team and all the contributors for your hard work.


@RustoMCSpit commented on GitHub (Jul 4, 2024):

> They preferred AnythingLLM because it only requires downloading and clicking to install, without the need for WSL, Docker, or similar tools

My friends use Podman and haven't heard of Docker (surprisingly), so the Docker-only instructions confused them.


@zhouxihong1 commented on GitHub (Aug 1, 2024):

I have packaged a Windows executable program for open-webui. Those in need can click the link to download it. It also supports modifying the .env file and is suitable for Windows x64.

https://github.com/zhouxihong1/open-webui/releases/download/V0.3.10/start_open_webui.7z


@steveepreston commented on GitHub (Aug 8, 2024):

Agree with @snakeying!

> the installation process can be a nightmare. Even with detailed instructions

After becoming familiar with this package and not finding an installer in the Releases section, I was about to give up. In despair, I looked at the issues list, and finding this issue gave me hope that it would become available.

Aside from the painful installation of WSL and Docker, my main concern is that running the program in Docker may have slower performance and higher memory consumption than running it directly on Windows.


@steveepreston commented on GitHub (Aug 10, 2024):

Thank you @zhouxihong1!
I downloaded and started your version successfully via `start_open_webui serve`.
I was not sure where to add models, so I went to the Models section in `localhost:8080/admin/settings`, but got the error `Ollama API is disabled` while Ollama was up.


@zhouxihong1 commented on GitHub (Aug 10, 2024):

You can edit the `.env` file or change the web settings.


@steveepreston commented on GitHub (Aug 10, 2024):

Thank you man, worked like a charm!
I suggest enabling it by default in the next releases:

```
ENABLE_OLLAMA_API=true
OLLAMA_BASE_URL='http://localhost:11434'
```

@steveepreston commented on GitHub (Aug 10, 2024):

@zhouxihong1 since 0.3.12 has been released in the org repo, with more to come, can you please create an automation script/action for building the exe version and open a PR here? That way everyone can be sure it's always updated and covered officially ❤️


@zhouxihong1 commented on GitHub (Aug 10, 2024):

You're welcome, and thank you for the suggestion. Compiling open-webui is not too difficult for me.


@zhouxihong1 commented on GitHub (Aug 10, 2024):

We will consider adding automated script compilation in the future.


@jerrychoices commented on GitHub (Aug 30, 2024):

> We will consider adding automated script compilation in the future

Thank you very much for sharing. Actually, I am also working on packaging open-webui, and I hope to exchange ideas with you.


@zhouxihong1 commented on GitHub (Aug 30, 2024):

Sure, I'll develop a Python script later that automates the compilation process with PyInstaller, making it easier to compile Open-WebUI within a virtual environment (venv).

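As a rough sketch of what such an automation script could look like (the entry-script name, output name, and options here are assumptions for illustration, not zhouxihong1's actual script), the venv's own interpreter can drive PyInstaller through subprocess:

```python
import subprocess
import sys

def pyinstaller_cmd(entry_script="run_open_webui.py"):
    # sys.executable points at the venv's interpreter, so the venv's
    # PyInstaller and dependencies are the ones used for the build.
    return [
        sys.executable, "-m", "PyInstaller",
        "--onefile",                   # single self-contained executable
        "--name", "start_open_webui",  # assumed output name
        entry_script,                  # assumed entry point
    ]

if __name__ == "__main__":
    subprocess.run(pyinstaller_cmd(), check=True)
```

Running the module as a script would perform the build inside the activated venv; `pyinstaller_cmd` alone just assembles the command line.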


@steveepreston commented on GitHub (Aug 30, 2024):

Hey @zhouxihong1!
Version 0.3.16 has been released; your latest exe is for 0.3.10.
I hope your efforts succeed soon. Thank you!


@zhouxihong1 commented on GitHub (Sep 5, 2024):

I wrote a script to compile open-webui, and it has passed self-testing on the latest version 0.3.18. If you need it, feel free to take it, and please give me a star.

https://github.com/zhouxihong1/build_start_open_webui


@jerrychoices commented on GitHub (Sep 6, 2024):

Great job. I have learned that some developers are using Tauri to package Svelte frontend applications, but I am not familiar with Rust. If someone who knows Rust could write a startup script that launches the Python backend before the frontend starts, then opens the frontend and builds it all with Tauri, it would make a particularly good offline desktop application.
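The "start the backend first, then open the frontend" ordering is the key part. A small illustrative sketch in Python (rather than Rust) of how a desktop shell could wait for the backend before loading the frontend URL; the host/port values are assumptions:

```python
import socket
import time

def wait_for_port(host="127.0.0.1", port=8080, timeout=30.0):
    """Poll until host:port accepts TCP connections, returning True,
    or give up after `timeout` seconds and return False. A Tauri or
    Electron shell could call this before showing the frontend window."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.2)
    return False
```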


@muhanstudio commented on GitHub (Oct 10, 2024):

I see many excellent examples of packaging and running the entire program, but is it possible for us to package the frontend as an independent app, distribute it across various platforms, and then just have the user input the backend address where we have deployed the server so that the client can run smoothly? I believe this could greatly reduce the pressure of dependency on the environment and make multi-platform distribution much more feasible. For the frontend alone, there are many mature cross-platform frameworks, such as Electron, Tauri, and WebView for mobile platforms.
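For the "user inputs the backend address" flow, a frontend-only client would at minimum want to normalize and sanity-check that address before saving it. A hypothetical helper (not part of any existing client):

```python
from urllib.parse import urlparse

def normalize_backend_url(raw: str) -> str:
    # Trim whitespace and a trailing slash, then require an explicit
    # http(s) scheme and a host so later API requests have a usable base URL.
    url = raw.strip().rstrip("/")
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        raise ValueError(f"invalid backend address: {raw!r}")
    return url
```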


@jerrychoices commented on GitHub (Oct 16, 2024):

> I see many excellent examples of packaging and running the entire program, but is it possible for us to package the frontend as an independent app, distribute it across various platforms, and then just have the user input the backend address where we have deployed the server so that the client can run smoothly?

Yeah, I have completed a Tauri package, but it is so complicated that I don't know how to publish it. It's not just the backend that needs packaging, but also the frontend, the Ollama server, and the model files, and I don't yet have an elegant, concise way to integrate it all.


@clicktodev commented on GitHub (Oct 19, 2024):

Yes please! A packaged binary would be really helpful for non-developers and would make installing the app easier.


@tjbck commented on GitHub (Jan 9, 2025):

Closing in favour of #8262

I'm actively working on this atm.

https://github.com/open-webui/app


Reference: github-starred/open-webui#12076