[GH-ISSUE #17038] feat: Web search functions should first be fed AI input before determining search results. #18149

Closed
opened 2026-04-20 00:19:28 -05:00 by GiteaMirror · 4 comments

Originally created by @yuliang615 on GitHub (Aug 29, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/17038

Check Existing Issues

  • I have searched the existing issues and discussions.

Problem Description

Current approach
Sending the user's entire message to the search engine API produces too many search terms, misses the point, and fetches a lot of irrelevant information.

Desired Solution you'd like

Add a dedicated query-generation step, similar to the existing title-generation prompt.

User asks a question -> Determine whether the user has enabled the web search function.

If yes: let the LLM determine what to search for -> call the search API -> return the results to the LLM for organized output.

If no: output the answer directly.

Alternatives Considered

No response

Additional Context

The web search function should first pass the user's input through the LLM to determine what to search for.

For example, if you ask the AI the following question:
"1+1 = 2
2+2 = 4
3+3 = 6
6+6 = 12
12+12 = 24
24+24 = 48
48+48 = 96
96+96 = how much
Please find Bitcoin market data," LLM will respond:
"Let me explain the math first:

96 + 96 = 192

Regarding your request for Bitcoin market information, I'm sorry, but the source link you provided currently displays a '403 Error', so I can't directly retrieve the latest data from that source.

However, based on my knowledge, the price of Bitcoin (BTC) fluctuates continuously, often influenced by supply and demand, market sentiment, global financial conditions, and regulatory policies. If you need real-time prices, please check:

CoinMarketCap
CoinGecko
Major exchanges (such as Binance and Coinbase)

Do you need real-time prices, or would you like me to explain Bitcoin market trends and analysis?"



@yuliang615 commented on GitHub (Aug 29, 2025):

Addendum: sending the entire message to the search API also makes retrieving search results take a very long time.


@rgaricano commented on GitHub (Aug 29, 2025):

You can set the query generation prompt template to instruct it as you prefer (Admin Settings > Interface > Query Generation Prompt Template).

Default Query Generation Prompt:
https://github.com/open-webui/open-webui/blob/2407d9b905978d68619bdce4021e424046ec8df9/backend/open_webui/config.py#L1653-L1675
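The linked default template asks the model to reply with JSON containing the generated queries. A hedged sketch of how such output might be consumed follows; the exact schema (`{"queries": [...]}`) and the tolerance for prose around the JSON are assumptions for illustration — check the linked config.py for the authoritative format.

```python
import json


def parse_generated_queries(llm_output: str) -> list[str]:
    """Extract search queries from an LLM's JSON reply.

    Tolerates extra prose around the JSON object (models often add
    commentary) and returns an empty list on malformed output, which
    a caller could treat as "no search needed".
    """
    start = llm_output.find("{")
    end = llm_output.rfind("}")
    if start == -1 or end == -1:
        return []
    try:
        data = json.loads(llm_output[start : end + 1])
    except json.JSONDecodeError:
        return []
    queries = data.get("queries", [])
    # Keep only non-empty string entries.
    return [q for q in queries if isinstance(q, str) and q.strip()]
```

Falling back to an empty list on parse failure is a deliberate choice here: a bad generation then degrades to "skip the web search" rather than sending garbage to the search engine.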


@yuliang615 commented on GitHub (Aug 29, 2025):

OK, thanks. So to address the problem I described above, I should change the prompt template here, right?


@yuliang615 commented on GitHub (Aug 29, 2025):

I checked the API logs and found that Open WebUI didn't send this request to the LLM. If this prompt were working, I would have seen the query in the logs.


Reference: github-starred/open-webui#18149