[GH-ISSUE #1549] API success but error from litellm #12547

Closed
opened 2026-04-19 19:28:10 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @nijisakai on GitHub (Apr 14, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1549

17:27:33 - LiteLLM:INFO: 

POST Request Sent from LiteLLM:
curl -X POST \
http://172.16.185.252:4001/v1/ \
-H 'Authorization: Bearer sk-UFWEmOmdBMA1dKt210E349B6D706405********************' \
-d '{'model': 'ERNIE-4.0-8K', 'messages': [{'role': 'user', 'content': '为以下查询创建一个简洁的、3-5个词的短语作为标题,严格遵守3-5个词的限制并避免使用“标题”一词: Tell me a random fun fact about the Roman Empire'}], 'stream': False, 'extra_body': {}}'


INFO:LiteLLM:

POST Request Sent from LiteLLM:
curl -X POST \
http://172.16.185.252:4001/v1/ \
-H 'Authorization: Bearer sk-UFWEmOmdBMA1dKt210E349B6D706405********************' \
-d '{'model': 'ERNIE-4.0-8K', 'messages': [{'role': 'user', 'content': '为以下查询创建一个简洁的、3-5个词的短语作为标题,严格遵守3-5个词的限制并避免使用“标题”一词: Tell me a random fun fact about the Roman Empire'}], 'stream': False, 'extra_body': {}}'


INFO:httpx:HTTP Request: POST http://172.16.185.252:4001/v1/chat/completions "HTTP/1.1 200 OK"
17:27:33 - LiteLLM Router:INFO: litellm.acompletion(model=openai/ERNIE-4.0-8K) Exception Invalid response object Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 6329, in convert_to_model_response_object
    for idx, choice in enumerate(response_object["choices"]):
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: 'NoneType' object is not iterable

INFO:LiteLLM Router:litellm.acompletion(model=openai/ERNIE-4.0-8K) Exception Invalid response object Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 6329, in convert_to_model_response_object
    for idx, choice in enumerate(response_object["choices"]):
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: 'NoneType' object is not iterable

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 6766, in exception_type
    message = original_exception.message
              ^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'Exception' object has no attribute 'message'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 2821, in chat_completion
    responses = await asyncio.gather(
                ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/router.py", line 395, in acompletion
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/router.py", line 391, in acompletion
    response = await self.async_function_with_fallbacks(**kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/router.py", line 1176, in async_function_with_fallbacks
    raise original_exception
  File "/usr/local/lib/python3.11/site-packages/litellm/router.py", line 1099, in async_function_with_fallbacks
    response = await self.async_function_with_retries(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/router.py", line 1235, in async_function_with_retries
    raise original_exception
  File "/usr/local/lib/python3.11/site-packages/litellm/router.py", line 1193, in async_function_with_retries
    response = await original_function(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/router.py", line 478, in _acompletion
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/router.py", line 454, in _acompletion
    response = await litellm.acompletion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 3147, in wrapper_async
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 2988, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 292, in acompletion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 8033, in exception_type
    raise original_exception
  File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 279, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 437, in acompletion
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/llms/openai.py", line 432, in acompletion
    return convert_to_model_response_object(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 6437, in convert_to_model_response_object
    raise Exception(f"Invalid response object {traceback.format_exc()}")
Exception: Invalid response object Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 6329, in convert_to_model_response_object
    for idx, choice in enumerate(response_object["choices"]):
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: 'NoneType' object is not iterable

ERROR:backoff:Giving up chat_completion(...) after 1 tries (litellm.proxy.proxy_server.ProxyException)
INFO:     172.24.235.74:8647 - "POST /litellm/api/v1/chat/completions HTTP/1.1" 500 Internal Server Error
INFO:     172.24.235.74:8647 - "POST /api/v1/chats/ad0dd6d0-8bdf-417a-a2c7-2adf74ee9e93 HTTP/1.1" 200 OK
INFO:     172.24.235.74:8647 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
INFO:     172.24.235.74:8647 - "GET /api/v1/chats/ HTTP/1.1" 200 OK

Reference: github-starred/open-webui#12547