Problem in weg rag search when using a Litellm model #1262

Closed
opened 2025-11-11 14:41:20 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @flefevre on GitHub (Jun 14, 2024).

Bug Report

Description

Bug Summary:
When I use the web search feature, it works with an Ollama model. If I switch to a model managed by LiteLLM, I get an error.
Note that the LiteLLM-managed model works fine in Open WebUI when the web search RAG feature is not used.

```
mixtral-8x7b 12:25 PM

Searched 3 sites
Uh-oh! There was an issue connecting to mixtral-8x7b.
External: 400, message='Bad Request', url=URL('http://litellm:8000/v1/chat/completions')
```

Steps to Reproduce:
  • In Open WebUI, load an Ollama model, activate the web search, and ask a question about the news in France >> works.

  • In Open WebUI, load a LiteLLM model, activate the web search, and ask the same question >> does not work.
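One way to isolate the failure is to replay a comparable request against the LiteLLM endpoint directly, bypassing Open WebUI. The sketch below is a hypothetical reproduction, not Open WebUI's actual code: it builds a standard OpenAI-style chat-completions payload with a large injected context (standing in for the scraped search results), so one can check whether the oversized prompt alone triggers the 400. The host `litellm:8000` and model name are taken from the error message above; the context content is made up.

```python
import json
import urllib.request

# Simulate the web-search RAG flow: a large scraped context is injected
# ahead of the user question (hypothetical stand-in for ~3 scraped pages).
scraped_context = "Lorem ipsum dolor sit amet. " * 2000

payload = {
    "model": "mixtral-8x7b",
    "messages": [
        {
            "role": "system",
            "content": f"Use the following context to answer:\n{scraped_context}",
        },
        {"role": "user", "content": "What is the news in France?"},
    ],
}
body = json.dumps(payload).encode("utf-8")

# Replaying this against the endpoint from the error message shows whether
# the large prompt alone causes the 400 (network call left commented out):
req = urllib.request.Request(
    "http://litellm:8000/v1/chat/completions",
    data=body,
    headers={"Content-Type": "application/json"},
)
# with urllib.request.urlopen(req) as resp:
#     print(resp.status, resp.read()[:200])

print(len(body))  # payload size in bytes; compare against any backend limit
```

If the direct call also returns 400 with a large context but succeeds with a short one, the problem is likely a request-size or context-length limit on the LiteLLM side rather than in Open WebUI.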

Expected Behavior:
The web search feature should work the same way with models served by Ollama or by LiteLLM.

Actual Behavior:
The chat fails with `External: 400, message='Bad Request'` from the LiteLLM endpoint (see the error message above).

Environment

  • **Open WebUI Version:** v0.3.4

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [ ] I have included the Docker container logs.
Reference: github-starred/open-webui#1262