Feature: Queue multiple messages #4159

Closed
opened 2025-11-11 15:47:00 -06:00 by GiteaMirror · 0 comments

Originally created by @thistleknot on GitHub (Feb 27, 2025).

Sometimes, when I'm working with code, I'd like the LLM to respond to my inputs serially, kind of like batch processing, where I can send one function at a time for the LLM to translate.

An alternative might be using an agent to read a file and process one incoming message at a time, in a serial fashion, appending each exchange to a chat memory.
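The serial flow described above could be sketched as a small queue processor. This is a minimal illustration, not anything in open-webui itself: `ask_llm` is a hypothetical callable standing in for a chat-completion request (e.g. to an OpenAI-compatible endpoint), and the message format assumes the common role/content chat schema.

```python
def process_queue(messages, ask_llm, history=None):
    """Send queued messages to the LLM one at a time, appending each
    exchange to a shared chat history so later messages see earlier
    context. `ask_llm` is a placeholder for the actual API call."""
    history = history if history is not None else []
    replies = []
    for msg in messages:
        history.append({"role": "user", "content": msg})
        reply = ask_llm(history)  # one serial call per queued message
        history.append({"role": "assistant", "content": reply})
        replies.append(reply)
    return replies, history
```

Each queued item (e.g. one function to translate) is sent only after the previous reply has been appended, so the conversation accumulates exactly as if the user had typed the messages one by one.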


Reference: github-starred/open-webui#4159