[GH-ISSUE #6369] enh: pip open-webui[min] installation #53004

Closed
opened 2026-05-05 14:13:49 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @AntonKrug on GitHub (Oct 23, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/6369

Feature Request

Is your feature request related to a problem? Please describe.
The Docker image has an option for the case where you run Ollama and the models on a separate machine; however, a local pip install pulls in the full set of NVIDIA CUDA dependencies, which I will not use. Is there a way to make a very minimal install of just the front-end UI, without the bulky rest?
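For context, the CUDA bulk arrives transitively through torch's default GPU wheels. A common workaround (a sketch under that assumption, not an official open-webui instruction) is to pre-install the CPU-only torch wheel from PyTorch's CPU package index so pip sees the dependency as already satisfied and skips the CUDA builds:

```shell
# Pre-install CPU-only torch from PyTorch's CPU wheel index,
# then install open-webui; pip will not re-resolve torch to a
# CUDA build because the requirement is already satisfied.
pip install torch --index-url https://download.pytorch.org/whl/cpu
pip install open-webui
```

This shrinks the install considerably but is still not the clean frontend/backend split requested here, since the rest of the ML stack is installed regardless.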

Describe the solution you'd like
Cleaner separation of backend and frontend when installing through pip.
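One way such a split is commonly expressed is through optional-dependency extras in `pyproject.toml`. The sketch below is purely illustrative of the requested `[min]`-style behavior; the group names and packages are hypothetical, not open-webui's actual packaging:

```toml
# Hypothetical packaging sketch: the base install carries only what the
# UI/server needs, and the heavy ML stack moves into an optional extra.
[project]
name = "open-webui"
dependencies = ["fastapi", "uvicorn"]  # illustrative base deps only

[project.optional-dependencies]
full = ["torch", "sentence-transformers"]  # illustrative heavy extras
```

With packaging like this, `pip install open-webui` would yield the minimal front end, while `pip install "open-webui[full]"` would opt in to the CUDA-capable stack.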

Describe alternatives you've considered
I tried installing it as is, but it is unnecessarily bulky: the torch library costs RAM, the NVIDIA CUDA packages cost storage, and neither will ever be used, because the models run on a different VM that has the GPU passed through and most of the RAM and storage allocated to it. I would strongly prefer this separation even if the UI ended up with some features disabled. As it stands I am forced to make a huge LXC container, and a multi-GB LXC container feels to me almost like an abuse of the concept of lightweight containers. At some point it may actually be better to make a dedicated VM instead of a container.

Additional context
I do not want to use Docker: I already have LXC support in my Proxmox setup, everything so far has been doable through LXC containers, and installing Docker for just one app feels like overkill.

Reference: github-starred/open-webui#53004