Mirror of https://github.com/open-webui/open-webui.git
Feature Request: Browser-Based LLM inference #3127
Originally created by @gloryknight on GitHub (Dec 27, 2024).
Feature Request
I propose integrating https://github.com/ngxson/wllama, an open-source WebAssembly binding for llama.cpp, as a selectable model option within the web UI, enabling in-browser LLM inference. This integration would allow users to run GGUF models directly in their browser.
Local execution could be exposed as a checkbox in the model capabilities (Admin Panel -> Models -> click on a model).
By leveraging wllama, we can combine the user-friendly interface of open-webui with local inference in the browser. This offers several potential benefits for our users, such as improved privacy (prompts never leave the device) and zero server-side inference cost.
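To make the proposal concrete, here is a minimal sketch of what the client-side integration point could look like, based on the usage shown in wllama's README. The WASM asset paths, the demo model URL, and the exact option names are assumptions and would need to be verified against whichever wllama version gets pinned:

```typescript
// Illustrative sketch only — assumes the @wllama/wllama package and its
// loadModelFromUrl / createCompletion API as documented in the wllama README.
import { Wllama } from "@wllama/wllama";

// Paths to the WASM builds shipped with the package. How these are served
// depends on the bundler; the URLs below are placeholders.
const CONFIG_PATHS = {
  "single-thread/wllama.wasm": "/wllama/single-thread/wllama.wasm",
  "multi-thread/wllama.wasm": "/wllama/multi-thread/wllama.wasm",
};

async function runInBrowser(modelUrl: string, prompt: string): Promise<string> {
  const wllama = new Wllama(CONFIG_PATHS);

  // Download a (small) GGUF model and load it straight into the browser.
  await wllama.loadModelFromUrl(modelUrl);

  // Generate a completion entirely client-side; no request hits the backend.
  return await wllama.createCompletion(prompt, {
    nPredict: 64,
    sampling: { temp: 0.7 },
  });
}
```

In the UI, this path would only be taken when the proposed "local execution" capability checkbox is enabled for the selected model; otherwise requests continue to go to the configured backend as they do today.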
wllama is available under the permissive MIT License, making it a suitable choice for integration into our existing web application.