Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-25 04:24:30 -05:00)
feat: notebook mode (text completion) #388
Originally created by @robertvazan on GitHub (Mar 1, 2024).
Originally assigned to: @tjbck on GitHub.
Is your feature request related to a problem? Please describe.
Ollama and cloud APIs support simple text generation (continuations) in addition to chat, but this is not exposed in WebUI.
Describe the solution you'd like
Maybe start with something like this: https://github.com/oobabooga/screenshots/raw/main/print_default.png
With some shortcuts (Tab to accept the next word of the continuation) and automatic regeneration when the prompt changes, this would allow for fairly convenient assisted writing. The KV cache will keep it performant unless you go back and edit something near the beginning of the prompt.
Another option is to offer several short continuations.
Describe alternatives you've considered
Instructing the model to continue the story is unwieldy, requires an instruction-tuned model, and works poorly with smaller models. It's possible to use a specialized tool, but I already have Ollama & WebUI set up (I have not researched the market comprehensively yet, so I'm not sure what's available).
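The completion mode the issue asks for already exists at the API level: Ollama's `/api/generate` endpoint accepts a `raw` flag that sends the prompt to the model verbatim, with no chat template applied. A minimal sketch of how a notebook mode might call it — the model name, local URL, and `complete` helper are assumptions for illustration, not part of any proposal in the thread:

```python
import json
from urllib import request

# Assumed local Ollama endpoint; the model name below is a placeholder.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, model: str = "mistral") -> dict:
    """Build a raw-completion request: the prompt is continued as-is."""
    return {
        "model": model,
        "prompt": prompt,                # text to be continued, verbatim
        "raw": True,                     # bypass the chat prompt template
        "stream": False,
        "options": {"num_predict": 32},  # short continuation, NovelAI-style
    }

def complete(prompt: str) -> str:
    """POST the payload and return the model's continuation text."""
    req = request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

A frontend could re-issue this request whenever the user's draft changes and surface the returned text as a grayed-out suggestion to accept word by word.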
@jannikstdl commented on GitHub (Mar 1, 2024):
Hi, have you tried this?

@robertvazan commented on GitHub (Mar 1, 2024):
@jannikstdl That's not what I have in mind. That button just encourages the LLM to write a longer response. I need the LLM to suggest the next few words of what I am writing. Base models are intended to be used like that. I could edit the response to contain the beginning of what I am writing and then request a continuation, but that's impractical unless the LLM is able to predict long stretches of text reliably, which the small ones cannot do. I need something more streamlined. The other issue is that this uses an instruction-tuned model, which will insist on being an assistant, whereas (I hope) the base models available in Ollama are more flexible.
@robertvazan commented on GitHub (Mar 1, 2024):
I am thinking of something akin to NovelAI.
@jannikstdl commented on GitHub (Mar 1, 2024):
I think this is something like the mention feature.

But as far as I know it's WIP.
@robertvazan commented on GitHub (Mar 1, 2024):
@jannikstdl Still not it. I don't want to produce a conversation between two AIs/characters. I want to write arbitrary text and have the AI suggest the next few words / list items / even whole paragraphs. Try the free NovelAI (no registration) to see what I mean.