[GH-ISSUE #7673] Infinitely increased output length limit despite models maximum output length with Google Gemini 1.5 Flash #53503

Closed
opened 2026-05-05 14:50:13 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @0xGitGuy on GitHub (Dec 7, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/7673

Is there a way to integrate a Google LLM like Gemini 1.5 Flash? These large-context-window LLMs are useful for pruning and modifying text files. There also needs to be a way to bypass the 8096 output length limit.
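One way this is commonly approached (not an official Open WebUI answer) is to reach Gemini through Google's OpenAI-compatible endpoint, which Open WebUI can consume as a generic OpenAI connection. A minimal sketch of the request payload is below; the base URL and model name reflect Google's documented compatibility endpoint but should be verified against the current Gemini docs, and note that Gemini 1.5 Flash enforces a hard cap of 8,192 output tokens per response, so a larger `max_tokens` request cannot bypass it.

```python
import json

# Google's documented OpenAI-compatibility endpoint (verify against current docs).
GEMINI_OPENAI_BASE = "https://generativelanguage.googleapis.com/v1beta/openai/"

# Hard per-response output cap for Gemini 1.5 models.
MAX_OUTPUT_TOKENS = 8192

def build_chat_request(prompt: str, max_tokens: int = MAX_OUTPUT_TOKENS) -> dict:
    """Build an OpenAI-style chat-completion payload for Gemini 1.5 Flash.

    Requesting more than 8,192 output tokens does not raise the limit:
    it is enforced by the model itself, so we clamp it client-side.
    """
    return {
        "model": "gemini-1.5-flash",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": min(max_tokens, MAX_OUTPUT_TOKENS),
    }

payload = build_chat_request("Prune the duplicate lines from this file.", max_tokens=100_000)
print(json.dumps(payload, indent=2))
```

For outputs longer than the cap, the usual workaround is continuation prompting: loop, feed the model its own truncated output, and ask it to continue, concatenating the pieces client-side.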

Reference: github-starred/open-webui#53503