[GH-ISSUE #2492] litellm reports error code 400 when sending a zero content length request #12901

Closed
opened 2026-04-19 19:44:16 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @hiddenblue on GitHub (May 22, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2492

Bug Report

The LiteLLM plugin in the Open WebUI Docker image is used to gain compatibility with LLMs other than ChatGPT, especially OpenAI-compatible models such as Qwen, ChatGLM, and so on.
However, Open WebUI sometimes sends a zero content length request to the LiteLLM proxy server, which relays the request to the OpenAI-compatible API. Such a zero-length request is rejected by the OpenAI-compatible endpoint.
As a result, LiteLLM occasionally reports Error code: 400.
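
For illustration, here is a minimal Python sketch of the kind of request that gets rejected: a POST with a zero-length body sent to an OpenAI-compatible chat completions endpoint. The endpoint and API key below are placeholders, not values taken from this report.

```python
import requests

# Placeholder endpoint and key, for illustration only.
API_BASE = "https://dashscope.aliyuncs.com/compatible-mode/v1"
API_KEY = "sk-xxxxxxxx"

# POST with an empty body (Content-Length: 0), similar to the request
# the report describes being relayed upstream. OpenAI-compatible servers
# reject such a request with HTTP 400.
resp = requests.post(
    f"{API_BASE}/chat/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    data=b"",  # zero-length request body
    timeout=30,
)
print(resp.status_code, resp.text)
```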

Description

Bug Summary:
Open WebUI sends a zero content length request to the LiteLLM proxy, which relays it to an OpenAI-compatible API; the upstream API rejects the request, so LiteLLM intermittently returns Error code: 400.

Steps to Reproduce:
See Reproduction Details below.

Expected Behavior:
Messages sent to OpenAI-compatible models through the LiteLLM proxy are answered without errors.

Actual Behavior:
LiteLLM occasionally returns Error code: 400 because the relayed request has zero content length.

Environment

  • Open WebUI Version: v0.1.125

  • Ollama (if applicable): [e.g., 0.1.30, 0.1.32-rc1]

  • Operating System: Ubuntu 24.04 (KDE)

  • Browser (if applicable): Edge 124.0.2478.97, Firefox

Reproduction Details

Add some OpenAI-compatible LLMs via the LiteLLM config file, then send messages to these models.

**Use the "openai" prefix in the "model" argument to get compatibility**
(LiteLLM does not officially support the Qwen and ChatGLM LLMs, but these LLM servers provide an OpenAI-compatible API; refer to https://litellm.vercel.app/docs/providers/openai_compatible)

```yaml
model_list:
  - model_name: qwen-turbo
    litellm_params:
      model: openai/qwen-turbo
      api_base: https://dashscope.aliyuncs.com/compatible-mode/v1
      api_key: xxxxxxxx
```

Then send some messages to these LLMs via Open WebUI.
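
For reference, a minimal sketch of sending a chat message through the LiteLLM proxy directly with the openai client, assuming the proxy is reachable at http://localhost:4000 and that the placeholder key below matches your proxy configuration:

```python
from openai import OpenAI

# Placeholder proxy address and key; adjust to your LiteLLM proxy setup.
client = OpenAI(
    base_url="http://localhost:4000/v1",
    api_key="sk-litellm-placeholder-key",
)

# The model name must match the model_name entry in the config above.
resp = client.chat.completions.create(
    model="qwen-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```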

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

![image](https://github.com/open-webui/open-webui/assets/62304226/bf2a7669-538f-4b59-8356-8a0404361764)

![image](https://github.com/open-webui/open-webui/assets/62304226/4e7e9c1c-5ba8-41a7-9602-0db6c5f7c502)
![image](https://github.com/open-webui/open-webui/assets/62304226/6f8d3b3a-d449-40bf-bed8-5ebce6dc7c5b)

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]
![image](https://github.com/open-webui/open-webui/assets/62304226/f6e9d054-0cab-4ff7-a269-6a49d960a926)

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

Installed via Docker.

Additional Information


![image](https://github.com/open-webui/open-webui/assets/62304226/889ab2dd-513e-4b51-949e-518853092f2e)

I copied the LiteLLM request information from the Docker logs and tried sending the same request to my remote LLM API manually,

and I can reproduce the problem.
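
A rough sketch of that manual replay (the payload and key below are placeholders; paste the actual JSON body captured from your own Docker logs):

```python
import json
import requests

# Placeholder payload; replace with the JSON body copied from the
# LiteLLM lines in the Docker logs.
payload = json.loads("""
{
  "model": "qwen-turbo",
  "messages": [{"role": "user", "content": ""}]
}
""")

resp = requests.post(
    "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions",
    headers={"Authorization": "Bearer sk-xxxxxxxx"},
    json=payload,
    timeout=30,
)
# In the reporter's case this replay reproduces the Error code: 400.
print(resp.status_code, resp.text)
```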

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!


Reference: github-starred/open-webui#12901