[GH-ISSUE #2671] New issues connecting to certain LiteLLM proxy models / "Expected last role to be one of"
Originally created by @seandearnaley on GitHub (May 30, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2671
Bug Report
Description
Bug Summary:
New issues connecting to certain LiteLLM models
Steps to Reproduce:
I use the `litellm/config.yaml` file to integrate multiple LLM models into my UI (a sketch of such a config is shown below). Everything worked smoothly until recently, when I encountered issues connecting to the Mistral and Perplexity models.
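For context, a minimal `litellm/config.yaml` of the kind described might look like the sketch below. The model names, provider routes, and environment-variable keys are illustrative assumptions, not the reporter's actual configuration:

```yaml
model_list:
  # Mistral, routed through LiteLLM's provider prefix (names are placeholders)
  - model_name: mistral-large
    litellm_params:
      model: mistral/mistral-large-latest
      api_key: os.environ/MISTRAL_API_KEY

  # Perplexity via the same mechanism
  - model_name: perplexity-sonar
    litellm_params:
      model: perplexity/llama-3-sonar-large-32k-online
      api_key: os.environ/PERPLEXITYAI_API_KEY
```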
To diagnose the issue, I tested the `litellm` proxy independently using curl commands, ensuring the correct "user" role and the same `config.yaml` configuration I use in open-webui. While other models function correctly in open-webui, the Mistral and Perplexity models remain inaccessible. This leads me to suspect that the `litellm` proxy might need an update, or that there is some other bug. The latest version of `litellm` is 1.39.4, whereas the `litellm` bundled with open-webui is on version 1.35.28. Below are the logs and the curl commands that work when testing `litellm` on its own (a hedged example of such a request follows):

Logs and Screenshots
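A direct test of the proxy, as described above, might look like the following. The port, API key, and model name are placeholders rather than the reporter's actual values; the relevant detail is that the last message carries the "user" role, since Mistral's API rejects conversations whose final message has any other role, which appears to be what produces the "Expected last role to be one of" error in the title:

```bash
# Hit the LiteLLM proxy's OpenAI-compatible endpoint directly
# (port, key, and model name are placeholders; adjust to your deployment).
curl -s http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "mistral-large",
    "messages": [
      {"role": "user", "content": "Hello, are you reachable?"}
    ]
  }'
```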
Environment
Reproduction Details
Confirmation:
@tjbck commented on GitHub (May 30, 2024):
Our latest dev build has removed bundled LiteLLM support; I'd recommend you start migrating your LiteLLM config.yaml to a self-hosted LiteLLM instance. You'd still be able to add those models to our webui via OpenAI Connections. Thanks for your understanding!
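As a rough sketch of that migration, assuming LiteLLM's documented CLI (the config path and port below are placeholders):

```bash
# Install the standalone LiteLLM proxy and start it with the same
# config.yaml that was previously used with the bundled LiteLLM.
pip install 'litellm[proxy]'
litellm --config ./config.yaml --port 4000
```

The proxy exposes an OpenAI-compatible API, so the models can then be re-added in Open WebUI by pointing an OpenAI API connection at `http://localhost:4000/v1` (plus the proxy's key, if one is set).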