[GH-ISSUE #10843] Feature: Queue multiple messages #54716

Closed
opened 2026-05-05 16:36:04 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @thistleknot on GitHub (Feb 27, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/10843

Sometimes, when working with code, I'd like to ask the LLM to respond in a serial fashion to the inputs I'm sending, kind of like a batch, where I can send one function at a time for the LLM to translate.

An alternative might be using an agent that reads a file and processes one incoming message at a time, in a serial fashion, appending each exchange to a chat memory.
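A minimal sketch of the requested workflow, assuming an OpenAI-compatible chat API behind a caller-supplied `complete` callable (the function name, signature, and default prompt here are illustrative, not part of Open WebUI):

```python
# Hedged sketch: drain a queue of inputs serially, one LLM request at a
# time, appending every exchange to a shared chat history.
# `complete` is any callable mapping a message list to a reply string,
# e.g. a thin wrapper around a /chat/completions call (hypothetical here).
from typing import Callable, Dict, List

Message = Dict[str, str]

def process_queue(
    inputs: List[str],
    complete: Callable[[List[Message]], str],
    system_prompt: str = "Translate each function as it arrives.",
) -> List[Message]:
    # Chat memory starts with the system prompt and grows with each turn.
    history: List[Message] = [{"role": "system", "content": system_prompt}]
    for text in inputs:
        history.append({"role": "user", "content": text})
        reply = complete(history)  # one message at a time, in order
        history.append({"role": "assistant", "content": reply})
    return history
```

Each input only goes out after the previous reply has been appended, so the model always sees the full conversation so far, which is the "serial, batch-like" behavior the issue asks for.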

Reference: github-starred/open-webui#54716