Mirror of https://github.com/open-webui/open-webui.git, synced 2026-05-07 03:18:23 -05:00
400 error in UI on context overflow #1847
Originally created by @ddzina on GitHub (Aug 22, 2024).
Bug Report
Installation Method
Open WebUI via Docker image in Docker Compose, with a LiteLLM container
Environment
Open WebUI Version: [v0.3.11]
LiteLLM: [v1.40.17]
Operating System: [Ubuntu 22.04]
Confirmation:
Expected Behavior:
If a user message exceeds the model's context limit, the user should receive a clear explanation of what happened, not an error that doesn't say what went wrong.
Actual Behavior:
If the user sends a large message and the context overflows, the model responds with a 400 error, which confuses the user.
Description
Bug Summary:
The user receives an uninformative 400 error when the LLM's context window overflows.
Reproduction Details
Steps to Reproduce:
Logs and Screenshots
Browser Console Logs:
Docker Container Logs:
OpenWebUI
LiteLLM
Screenshots/Screen Recordings (if applicable):

Additional Information
Although the logs I'm sending are from a case using LiteLLM, the behaviour is the same with a direct OpenAI connection via the OpenAI API, so it is not an issue on the LiteLLM side. Moreover, the same error occurs if the user sends several large messages that each fit within the context individually, but whose combined context exceeds the limit. Since LiteLLM already catches the ContextWindowExceededError, I ask that logic be implemented to handle this error.
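The handling requested above could look something like the following minimal sketch. It assumes an OpenAI-style error envelope, where a context overflow comes back as HTTP 400 with `code: "context_length_exceeded"`; the function name and the user-facing wording are purely illustrative, not Open WebUI's actual implementation.

```python
# Sketch: map an OpenAI-style 400 error body to a user-facing message.
# Assumption: the provider returns the OpenAI error envelope
# {"error": {"message": ..., "type": ..., "code": ...}}, where a
# context overflow uses code "context_length_exceeded".

def friendly_error(status: int, body: dict) -> str:
    """Translate a provider error into a message the user can act on."""
    error = body.get("error", {}) if isinstance(body, dict) else {}
    if status == 400 and error.get("code") == "context_length_exceeded":
        return ("Your conversation exceeds the model's context window. "
                "Try shortening your message or starting a new chat.")
    # Fall back to the raw provider message for other errors.
    return error.get("message", f"Request failed with status {status}")


# Example: a typical overflow response body from an OpenAI-compatible API.
overflow_body = {
    "error": {
        "message": "This model's maximum context length is 8192 tokens...",
        "type": "invalid_request_error",
        "param": "messages",
        "code": "context_length_exceeded",
    }
}
print(friendly_error(400, overflow_body))
```

The same check would cover the multi-message case described above, since the provider raises the same error code whenever the accumulated context exceeds the limit.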
@tjbck commented on GitHub (Aug 23, 2024):
If you could provide a detailed guide on how to reproduce with OpenAI (not LiteLLM) API, I'd greatly appreciate it.