[GH-ISSUE #17038] feat: Web search functions should first be fed AI input before determining search results. #33678
Originally created by @yuliang615 on GitHub (Aug 29, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/17038
Problem Description
Current approach
Calling the search engine API with the entire message produces too many search terms, misses the point, and fetches a lot of irrelevant information.
Desired Solution you'd like
Add a dedicated query-generation step driven by its own prompt, the same way title generation already works (see the sketch after this list):
User asks a question -> check whether the user has enabled the web search function.
If yes, let the LLM decide what to search for -> call the search API -> return the results to the LLM for an organized answer.
If no, answer directly.
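A minimal sketch of that flow in Python, assuming a generic llm(prompt) -> str callable and a search_api(query) -> list[str] callable; the names are placeholders for illustration, not Open WebUI's actual internals:

```python
# Illustrative sketch only; function and parameter names are hypothetical,
# not Open WebUI's real API.

def answer(user_message: str, web_search_enabled: bool, llm, search_api) -> str:
    """Answer a user message, optionally routing it through web search."""
    if not web_search_enabled:
        # Web search is disabled: answer directly from the model.
        return llm(user_message)

    # 1. Let the LLM distill the message into a focused search query,
    #    the same way title generation already uses a dedicated prompt.
    query = llm(
        "Extract a short web search query from the message below. "
        "Return only the query.\n\n" + user_message
    )

    # 2. Call the search API with the distilled query, not the whole message.
    results = search_api(query)

    # 3. Hand the results back to the LLM to compose the final answer.
    return llm(
        "Answer the user's message using these search results:\n\n"
        + "\n".join(results)
        + "\n\nMessage:\n" + user_message
    )
```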
Alternatives Considered
No response
Additional Context
The web search function should first feed the user's input to the AI to determine what to search for, instead of searching on the raw input.
For example, if you ask the AI the following question:
"1+1 = 2
2+2 = 4
3+3 = 6
6+6 = 12
12+12 = 24
24+24 = 48
48+48 = 96
96+96 = how much
Please find Bitcoin market data," the LLM will respond:
"Let me explain the math first:
96 + 96 = 192 ✅
Regarding your request for Bitcoin market information, I'm sorry, but the source link you provided currently displays a "403 Error," so I can't directly retrieve the latest data from that source.
However, based on my knowledge, the price of Bitcoin (BTC) fluctuates continuously, often influenced by supply and demand, market sentiment, global financial conditions, and regulatory policies. If you need real-time prices, please check:
CoinMarketCap
CoinGecko
Major exchanges (such as Binance and Coinbase)
Do you need real-time prices, or would you like me to explain Bitcoin market trends and analysis?"
@yuliang615 commented on GitHub (Aug 29, 2025):
Addendum: sending the entire message to the search API also makes the search take a very long time.
@rgaricano commented on GitHub (Aug 29, 2025):
We can set the query generation prompt template to instruct it however we prefer (Admin Settings > Interface > Query Generation Prompt Template).
Default Query Generation Prompt:
2407d9b905/backend/open_webui/config.py (L1653-L1675)
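For example, a custom template along these lines could be pasted into that field. This is an illustrative sketch, not the shipped default, and the chat-history placeholder is assumed to match the one used in the default template at the config.py lines above; check there for the actual syntax:

```
### Task:
From the chat history below, generate 1-3 concise web search queries that capture
only what actually requires external, up-to-date information. Ignore parts of the
message the model can answer itself (e.g. simple arithmetic).
Return only the queries, one per line.

<chat_history>
{{MESSAGES:END:6}}
</chat_history>
```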
@yuliang615 commented on GitHub (Aug 29, 2025):
OK, thanks. So to fix the problem I described above, I should change this prompt, right?
@yuliang615 commented on GitHub (Aug 29, 2025):
I checked the API logs and found that Open WebUI didn't send such a request to the LLM. If this prompt were working, I would have seen the query-generation request in the logs.