[GH-ISSUE #2671] New issues connecting to certain LiteLLM proxy models / "Expected last role to be one of" #51641

Closed
opened 2026-05-05 12:43:18 -05:00 by GiteaMirror · 1 comment

Originally created by @seandearnaley on GitHub (May 30, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2671

Bug Report

Description

Bug Summary:
New issues connecting to certain LiteLLM models

Steps to Reproduce:

I use the litellm/config.yaml file to integrate multiple LLM models into my UI. Everything worked smoothly until recently when I encountered issues connecting to the Mistral and Perplexity models.

To diagnose the issue, I tested the litellm proxy independently using curl commands, ensuring the correct "user" role and the same config.yaml configuration I use in open-webui. While other models function correctly in open-webui, the Mistral and Perplexity models remain inaccessible. This leads me to suspect that the bundled litellm proxy needs an update or there is some other bug: the latest litellm release is 1.39.4, whereas open-webui bundles 1.35.28. Below are the logs and the curl commands that work when testing litellm on its own:

```sh
curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Content-Type: application/json' \
--data-raw '{
  "model": "perplexity/llama-3-70b-instruct",
  "messages": [
    {
      "role": "user",
      "content": "what llm are you"
    }
  ]
}'
```

```sh
curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Content-Type: application/json' \
--data-raw '{
  "model": "mistral/mistral-large-latest",
  "messages": [
    {
      "role": "user",
      "content": "what llm are you"
    }
  ]
}'
```
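
For completeness, the proxy-side registration of these models in litellm/config.yaml would look roughly like this (a minimal sketch; the environment-variable names are illustrative and would match whatever keys your deployment actually sets):

```yaml
model_list:
  - model_name: perplexity/llama-3-70b-instruct
    litellm_params:
      model: perplexity/llama-3-70b-instruct
      api_key: os.environ/PERPLEXITYAI_API_KEY
  - model_name: mistral/mistral-large-latest
    litellm_params:
      model: mistral/mistral-large-latest
      api_key: os.environ/MISTRAL_API_KEY
```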

Logs and Screenshots

```
MistralException - Error code: 400 - {'object': 'error', 'message': 'Expected last role to be one of: [user, tool] but got assistant', 'type': 'invalid_request_error', 'param': None, 'code': None}
```

```
External: PerplexityException - Error code: 400 - {'error': {'message': 'Last message must have role `user`.', 'type': 'invalid_message', 'code': 400}}
```
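
Both errors say the same thing: these providers reject a conversation whose final message has role `assistant`. If open-webui is forwarding a trailing assistant message (e.g., an empty placeholder for the pending reply), a request shaped like the following hypothetical payload would reproduce the 400 against the same proxy:

```sh
# Hypothetical reproduction: the final message has role "assistant",
# which both Mistral and Perplexity reject with a 400.
curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Content-Type: application/json' \
--data-raw '{
  "model": "mistral/mistral-large-latest",
  "messages": [
    { "role": "user", "content": "what llm are you" },
    { "role": "assistant", "content": "" }
  ]
}'
```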

Environment

  • Open WebUI Version: [e.g., 0.1.125]

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

@tjbck commented on GitHub (May 30, 2024):

Our latest dev build has removed bundled LiteLLM support; I'd recommend you start migrating your LiteLLM config.yaml to a self-hosted LiteLLM instance. You'd still be able to add those models to our webui via OpenAI Connections. Thanks for your understanding!
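
For anyone migrating, a minimal sketch of that setup (image tags, ports, host paths, and the dummy key are illustrative; adjust to your environment):

```sh
# Run LiteLLM standalone with the existing config.yaml (4000 is its default port).
docker run -d --name litellm \
  -v "$(pwd)/litellm/config.yaml:/app/config.yaml" \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml

# Point Open WebUI at the proxy as an OpenAI-compatible connection
# (or add http://<host>:4000/v1 under Settings > Connections in the UI).
docker run -d --name open-webui \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:4000/v1 \
  -e OPENAI_API_KEY=dummy-key \
  -p 3000:8080 \
  ghcr.io/open-webui/open-webui:main
```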
