[GH-ISSUE #5469] LiteLLM "Budget has been exceeded!" error is translated to "Bad Request" #52657

Closed
opened 2026-05-05 13:45:24 -05:00 by GiteaMirror · 2 comments

Originally created by @vogtp on GitHub (Sep 17, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/5469

Bug Report

Installation Method

docker

Environment

  • Open WebUI Version: v0.3.21

  • LiteLLM Version: 1.46.1

  • Operating System: Ubuntu 24.04.1 LTS

  • Browser (if applicable): Google Chrome 128.0.6613.84

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

Display the HTTP error to the user:
{"error":{"message":"Budget has been exceeded! Current cost: 0.3759, Max budget: 0.35","type":"budget_exceeded","param":null,"code":"400"}}

The content of error.message would give a nice error message:
Uh-oh! There was an issue connecting to GPT-4.
Budget has been exceeded! Current cost: 0.3759, Max budget: 0.35
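The mapping from the upstream JSON body to that message can be sketched in Python. This is an illustrative helper, not Open WebUI's actual code; the name `extract_error_detail` is hypothetical, and it assumes an OpenAI-style `{"error": {"message": ...}}` envelope:

```python
import json

def extract_error_detail(status: int, body: str) -> str:
    """Return the upstream error message if the body follows the
    OpenAI-style {"error": {"message": ...}} envelope; otherwise
    fall back to a generic status line (hypothetical helper)."""
    try:
        message = json.loads(body)["error"]["message"]
        if isinstance(message, str) and message:
            return message
    except (json.JSONDecodeError, KeyError, TypeError):
        pass
    return f"External: {status}, message='Bad Request'"

# The exact body LiteLLM returns in this report:
body = ('{"error":{"message":"Budget has been exceeded! '
        'Current cost: 0.3759, Max budget: 0.35",'
        '"type":"budget_exceeded","param":null,"code":"400"}}')
print(extract_error_detail(400, body))
```

With this kind of extraction, the user-facing dialog could show the `budget_exceeded` message instead of the opaque `Bad Request`.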

Actual Behavior:

The user gets a generic error message:
Uh-oh! There was an issue connecting to GPT-4.
External: 400, message='Bad Request', url='http://host.docker.internal:4444/chat/completions'

Description

Bug Summary:
Open WebUI does not pass on LiteLLM's error messages and therefore confuses the user.

Reproduction Details

Steps to Reproduce:

  1. Limit the budget of the LiteLLM proxy: https://docs.litellm.ai/docs/proxy/users
  2. Add LiteLLM as an OpenAI endpoint in Open WebUI
  3. Query the budget-limited model
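For step 1, the linked docs describe per-user budgets; a proxy-wide budget can also be set in `config.yaml`. The fragment below is a sketch based on those docs, and the exact keys and values should be verified against your LiteLLM version (the model entry is an example, not from this report):

```yaml
# LiteLLM proxy config sketch -- verify keys against your LiteLLM version
model_list:
  - model_name: gpt-4
    litellm_params:
      model: azure/gpt-4   # example upstream; adjust to your deployment
litellm_settings:
  max_budget: 0.35         # USD cap; exceeding it yields budget_exceeded
  budget_duration: 30d     # reset window
```

Once the cap is exceeded, the proxy answers `/chat/completions` with the 400 `budget_exceeded` body shown above.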

Logs and Screenshots

Browser Console Logs:
{
"detail": "External: 400, message='Bad Request', url='http://host.docker.internal:4444/chat/completions'"
}

(anonymous) @ Chat.svelte:743
await in (anonymous)
$e @ Chat.svelte:688
await in $e
rt @ Chat.svelte:604
await in rt
_t @ MessageInput.svelte:547

Docker Container Logs:
INFO: 10.3.2.161:0 - "GET /static/favicon.png HTTP/1.1" 200 OK
ERROR [open_webui.apps.openai.main] 400, message='Bad Request', url='http://host.docker.internal:4444/chat/completions'
Traceback (most recent call last):
File "/app/backend/open_webui/apps/openai/main.py", line 438, in generate_chat_completion
r.raise_for_status()
File "/usr/local/lib/python3.11/site-packages/aiohttp/client_reqrep.py", line 1093, in raise_for_status
raise ClientResponseError(
aiohttp.client_exceptions.ClientResponseError: 400, message='Bad Request', url='http://host.docker.internal:4444/chat/completions'
INFO: 10.3.2.161:0 - "POST /api/chat/completions HTTP/1.1" 400 Bad Request
INFO: 10.3.2.161:0 - "POST /api/v1/chats/b350b35f-d8ba-4c2c-afa4-9300cbbd6dd5 HTTP/1.1" 200 OK
INFO: 10.3.2.161:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO [open_webui.apps.ollama.main] url: http://llama-1.its.unibas.ch:11434
generate_title
llama3.1:latest
INFO: 10.3.2.161:0 - "POST /api/task/title/completions HTTP/1.1" 200 OK
INFO: 10.3.2.161:0 - "POST /api/v1/chats/b350b35f-d8ba-4c2c-afa4-9300cbbd6dd5 HTTP/1.1" 200 OK
INFO: 10.3.2.161:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
INFO: 10.3.2.161:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK

Screenshots/Screen Recordings (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!


@GrayXu commented on GitHub (Sep 17, 2024):

To add, Azure OpenAI also reports a Bad Request when its content filter is triggered, which can indeed mislead users.


@tjbck commented on GitHub (Sep 19, 2024):

Should be fixed on dev, let me know if the issue persists!

Reference: github-starred/open-webui#52657