[GH-ISSUE #14463] issue: regression on v0.6.12 with RAG #17264

Closed
opened 2026-04-19 22:58:12 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @bb-chris on GitHub (May 29, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/14463

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Git Clone

Open WebUI Version

v0.6.12

Ollama Version (if applicable)

n/a

Operating System

Ubuntu 22.04

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

RAG should work without errors

Actual Behavior

I'm getting this error when using RAG after upgrading to v0.6.12. Downgrading to v0.6.11 makes it go away:

400: litellm.BadRequestError: BedrockException - {"message":"The additional field system conflicts with an existing field. Remove system and try again."}. Received Model Group=anthropic.claude-sonnet-4-20250514-v1:0 Available Model Group Fallbacks=None

Using this same model WITHOUT RAG works just fine.
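The Bedrock error complains about a duplicated `system` field. A minimal sketch of how that collision can arise, assuming (not confirmed by this issue) that a custom `system` parameter gets forwarded through LiteLLM as an extra body field alongside the Converse API's own top-level `system` field:

```python
# Hypothetical request shape (not Open WebUI's actual code). Bedrock's
# Converse API reserves a top-level "system" field; any extra param also
# named "system" conflicts with it and triggers the 400 quoted above.
payload = {
    "modelId": "anthropic.claude-sonnet-4-20250514-v1:0",
    "system": [{"text": "You are a helpful assistant."}],  # Converse API's own field
    "messages": [
        {"role": "user", "content": [{"text": "Summarize the attached PDF."}]}
    ],
    # Assumption: a custom "system" param leaks in as an additional model
    # request field, duplicating the top-level field above.
    "additionalModelRequestFields": {"system": "RAG prompt template text"},
}

# Detect field names that appear both at the top level and as extras.
conflicts = set(payload["additionalModelRequestFields"]) & (
    set(payload) - {"additionalModelRequestFields"}
)
print(conflicts)  # {'system'}
```

This matches the symptom: without RAG no custom `system` value is injected, so no duplicate field is sent and the same model works.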

Steps to Reproduce

  • Setup RAG on v0.6.12
  • I'm using LiteLLM with Bedrock models
  • Bedrock is also used for both embedding and reranking
  • Upload PDFs as knowledge and create a model pointing at that knowledge (basic RAG setup as per OWUI docs)

Logs & Screenshots

400: litellm.BadRequestError: BedrockException - {"message":"The additional field system conflicts with an existing field. Remove system and try again."}. Received Model Group=anthropic.claude-sonnet-4-20250514-v1:0 Available Model Group Fallbacks=None

Additional Information

No response

GiteaMirror added the bug label 2026-04-19 22:58:12 -05:00

@Hisma commented on GitHub (May 29, 2025):

What content extraction engine are you using?
Verify it in the Admin Document settings.


@stanislavKosacek commented on GitHub (May 29, 2025):

I have the same issue with every custom model based on OpenAI models:
400: litellm.BadRequestError: OpenAIException - Unrecognized request argument supplied: system. Received Model Group=gpt-4.1 Available Model Group Fallbacks=None.
I also connect through LiteLLM. It started after updating to Open WebUI v0.6.12.

Image: https://github.com/user-attachments/assets/73092261-5e2d-4e03-bed5-f6c953f064b5

@tjbck commented on GitHub (May 29, 2025):

This has nothing to do with RAG and more to do with custom params.

#14469


Reference: github-starred/open-webui#17264