Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-10 07:43:10 -05:00)
feat: Minimal Installation Option for Open WebUI #4503
Originally created by @dror-llm on GitHub (Mar 20, 2025).
Problem Description
Currently, installing Open WebUI via pip (pip install open-webui) pulls in a full set of dependencies, including NVIDIA CUDA drivers, PyTorch, and various ML libraries, totaling approximately 7.7 GB. However, many users only need the frontend UI and backend server to connect to a remote Ollama instance running on a separate machine, which makes these heavy dependencies unnecessary. When installing via pip, the following large packages are included even when they won't be used:
The Docker implementation already has an option to run with a remote Ollama instance, but the pip installation doesn't offer a lightweight alternative.
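For reference, the existing Docker route with a remote Ollama looks roughly like the sketch below. The OLLAMA_BASE_URL variable and image name follow the project's documented Docker usage; the hostname is a placeholder, and the command is echoed as a dry run so it is visible even without Docker installed (drop the echo to actually start the container).

```shell
#!/bin/sh
# Hypothetical remote Ollama endpoint; adjust to your own server.
OLLAMA_BASE_URL="http://ollama-host:11434"

# Run only the WebUI container; no local Ollama or CUDA stack is needed.
# Echoed as a dry run -- remove "echo" to execute for real.
echo docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL="$OLLAMA_BASE_URL" \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

A pip-based minimal install would give the same "UI only, remote inference" setup without requiring Docker.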
Desired Solution
Create a minimal installation option for Open WebUI that:
Possible implementation options:
1. Create optional dependency groups in pyproject.toml. Users could then install with: pip install open-webui[minimal]
2. Create an installation script based on the Dockerfile's conditional logic (USE_CUDA, USE_OLLAMA, etc.)
3. Provide a separate pip package, such as open-webui-minimal, that only includes the components needed to connect to a remote Ollama instance.
Alternatives Considered
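For the optional-dependency-group approach, the extras could be declared roughly as below. This is a sketch: the dependency names and the split are illustrative, not Open WebUI's actual requirement set. Note that pip extras can only add packages, so in practice the base install would be the minimal one, with the heavy ML stack behind an extra (e.g. open-webui[cuda]) rather than a literal [minimal] extra.

```toml
# Sketch only: package names are illustrative, not the project's real
# dependency list. Extras can only ADD packages, so the base install is
# the minimal one and the heavy stack sits behind an optional group.
[project]
name = "open-webui"
dependencies = [
    # Lightweight core: web server plus a client for a remote Ollama.
    "fastapi",
    "uvicorn",
    "requests",
]

[project.optional-dependencies]
# Heavy local-inference stack, installed only on request.
cuda = [
    "torch",
    "sentence-transformers",
]
```

Users wanting only the UI would run pip install open-webui, while local-inference users would run pip install "open-webui[cuda]".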
Using Docker with the remote Ollama option - this works but many users prefer a direct pip installation for simplicity, especially in environments where Docker isn't available or preferred.
Creating a custom installation by manually removing packages after installation - this is cumbersome and might break functionality.
Building from source with custom options - this requires more technical knowledge than a simple pip install.
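The Dockerfile-style conditional installer from the options above could be sketched as follows. Only the USE_CUDA and USE_OLLAMA flag names mirror the real Dockerfile build args; the package lists are illustrative assumptions, and the final pip command is echoed rather than executed.

```shell
#!/bin/sh
# Sketch only: package lists are illustrative, not the project's actual
# dependency split. USE_CUDA/USE_OLLAMA mirror the Dockerfile build args.
USE_CUDA="${USE_CUDA:-false}"
USE_OLLAMA="${USE_OLLAMA:-false}"

# Everyone gets the lightweight core web stack.
PACKAGES="fastapi uvicorn"

if [ "$USE_CUDA" = "true" ]; then
    # Pull in the heavy ML stack only when local GPU inference is wanted.
    PACKAGES="$PACKAGES torch"
fi

if [ "$USE_OLLAMA" = "true" ]; then
    # Local Ollama support; skipped when connecting to a remote instance.
    PACKAGES="$PACKAGES ollama"
fi

# Dry run: print instead of executing so the selection is visible.
echo "pip install $PACKAGES"
```

With both flags left at their defaults, only the minimal core is selected, which is exactly the remote-Ollama use case described above.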
Additional Context
Benefits of this feature include:
Technical considerations:
@dror-llm commented on GitHub (Mar 20, 2025):
Oops, sorry about this.
That's what happens when you let an LLM open issues. Sometimes they get confused and post to the wrong repo.