[GH-ISSUE #13443] issue: Web search has no effect on LLM output #16912

Closed
opened 2026-04-19 22:43:53 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @chrisspen on GitHub (May 2, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/13443

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

0.6.5

Ollama Version (if applicable)

0.6.7

Operating System

Ubuntu 24.04

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

Enabling "Web Search" should allow a model to search the web and use web results when formulating its response.

Actual Behavior

I enabled the "Web Search" setting in the admin panel, selecting "google_pse" as the backend search engine to use. I created a Google Search API key as well as a Google PSE Engine ID and entered them into Open WebUI. I left the other settings at their defaults.

This did cause a "Web Search" button to show under my chat text input field. However, it seems to have no effect.

As a test, I asked, "What is the temperature in Tokyo today?" With the web search button disabled, it responded with:

I currently don't have access to real-time weather data, so I can't provide the exact temperature in Tokyo for today. However, you can check the current temperature using a weather service like the **Japan Meteorological Agency (JMA)** website, **Weather.com**, or a weather app (e.g., Weather.com, AccuWeather, or the built-in weather app on your phone).

If you'd like, I can give you general information about Tokyo's typical temperatures for this time of year! Let me know.

Which is what I would expect. However, when I enabled the web search button, it says it searched 20 sites, and then responded with:

I currently don't have access to real-time weather data, so I can't provide the exact temperature in Tokyo for today. However, you can check the current temperature using a weather service like the **Japan Meteorological Agency (JMA)** website, **Weather.com**, or a weather app (e.g., Weather.com, AccuWeather, or the built-in weather app on your phone).

If you'd like, I can give you general information about Tokyo's typical temperatures for this time of year! Let me know.

So its response was essentially identical, except it wasted a lot of additional time searching the web. Since the temperature in Tokyo is easy to find with a Google search, I can only assume the web results are not being forwarded to the LLM when it formulates its response. However, it's unclear what additional setting I need to change to fix this.

Steps to Reproduce

  1. Go to /admin/settings
  2. Click on Web Search
  3. Toggle "Web Search" to "on".
  4. Select "google_pse" as the search engine and enter your API key and PSE Engine ID.
  5. Click Save.
  6. Open a new chat.
  7. Enable the "Web Search" button below the text input field.
  8. Ask a question the LLM should only be able to find by searching the web.
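To rule out a credentials problem before blaming Open WebUI's result forwarding, the Google PSE key and engine ID from step 4 can be tested directly against the Google Custom Search JSON API. This is a minimal sketch; the `API_KEY` and `ENGINE_ID` placeholders are stand-ins for your own values, not anything from this report:

```python
# Verify Google PSE credentials independently of Open WebUI by calling
# the Google Custom Search JSON API directly. Bad credentials raise
# an HTTPError (403), which would point to a config issue rather than
# a result-forwarding bug.
import json
import urllib.parse
import urllib.request


def build_search_url(api_key: str, engine_id: str, query: str) -> str:
    """Build a Custom Search JSON API request URL from key, cx, and query."""
    params = urllib.parse.urlencode(
        {"key": api_key, "cx": engine_id, "q": query}
    )
    return f"https://www.googleapis.com/customsearch/v1?{params}"


def search(api_key: str, engine_id: str, query: str) -> list[str]:
    """Return the titles of matching results; raises on invalid credentials."""
    with urllib.request.urlopen(build_search_url(api_key, engine_id, query)) as resp:
        data = json.load(resp)
    return [item["title"] for item in data.get("items", [])]


if __name__ == "__main__":
    # Substitute real values; placeholders here are assumptions for the sketch.
    print(search("API_KEY", "ENGINE_ID", "temperature in Tokyo today"))
```

If this returns result titles, the credentials work and the problem lies in how Open WebUI injects the search results into the model's context.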

Logs & Screenshots

![Image](https://github.com/user-attachments/assets/548512d9-5f92-46f6-8274-9fef33113b9a)

Additional Information

No response

GiteaMirror added the bug label 2026-04-19 22:43:53 -05:00

@geiseri commented on GitHub (May 3, 2025):

I see this quite a bit. Usually a regenerate will cause it to perform the search. Sometimes a 3rd time since it seems to only populate the RAG sometimes.

Reference: github-starred/open-webui#16912