Mirror of https://github.com/open-webui/open-webui.git, synced 2026-05-07 19:38:46 -05:00
[GH-ISSUE #21299] issue: Knowledge attached to a model isn't used anymore #58100
Originally created by @normen on GitHub (Feb 10, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/21299
Check Existing Issues
Installation Method
Docker
Open WebUI Version
0.7.2
Ollama Version (if applicable)
No response
Operating System
Ubuntu 22
Browser (if applicable)
Firefox
Confirmation
Expected Behavior
When creating a model and attaching a knowledge base to it, I expect the model to automatically perform RAG on that knowledge base, even when it's called via the API. This was the case until a few versions ago.
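For concreteness, the API path meant here is Open WebUI's OpenAI-compatible chat completions endpoint. Below is a minimal sketch of building such a call; the base URL, API key, and model id are placeholders, not values from this report:

```python
# Minimal sketch of calling a custom model through Open WebUI's
# OpenAI-compatible chat completions endpoint. The base URL, API key,
# and model id below are placeholders, not values from this thread.
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model_id: str, prompt: str):
    """Build the HTTP request for a chat completion against Open WebUI."""
    payload = {
        "model": model_id,  # the custom model with the knowledge base attached
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/api/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("http://localhost:3000", "sk-placeholder",
                         "my-custom-model", "What does the handbook say about X?")
```

The expectation is that a request like this against the custom model triggers retrieval server-side, with no knowledge base mentioned in the request itself.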
Actual Behavior
The knowledge base RAG is only used when manually adding the knowledge base to any chat, not when selecting a model with configured knowledge base.
Steps to Reproduce
Logs & Screenshots
No logs, as there are no errors.
Additional Information
No response
@Classic298 commented on GitHub (Feb 10, 2026):
Is this even an issue? I'm pretty sure it's solved in dev, if I read it correctly.
And what is your proposed solution? Your message is empty, "AI" gentsy.
@normen commented on GitHub (Feb 11, 2026):
I'd say it's an issue: why would you still be able to attach a knowledge base to a model if it's never used? I suppose this is a regression from when the knowledge base revamp and the internal functions were added.
I don't use the dev version; if it's true that knowledge bases attached to models are working again, then all the better.
@Classic298 commented on GitHub (Feb 11, 2026):
AI gentsy deleted his message. Alright.
Yes, of course it's an issue on main, haha, but by "is this even an issue" I did mean whether it was still an issue.
Ah wait, I remember something: I think the PR that fixes it is still open.
Keeping this open for tracking purposes, but it should be solved soon.
@normen commented on GitHub (Feb 15, 2026):
The latest version 0.8.1 still has this issue, knowledge bases attached to a model are not used.
@Classic298 commented on GitHub (Feb 15, 2026):
Yes. The PR was not merged yet.
@MacJedi42 commented on GitHub (Feb 19, 2026):
Forgive me if this is unrelated, but I have noticed that even on 0.8.3 (and the dev branch as of writing) attached files no longer have any content inside them, so my model can't get data from inside a text file. When I attach a .txt file directly to the chat with a message, the model informs me that it can't see the content or the attached file. When I click the file in the UI, the content shows as none, even though the file shows as 1.8 KB in size.
Hopefully this is part of the same problem, as I suspect the Knowledge retrieval uses the same mechanism in the backend as file attachments.
@Classic298 commented on GitHub (Feb 19, 2026):
Yes, this is unrelated.
@n4gY1 commented on GitHub (Feb 20, 2026):
I also noticed this problem. If I give the created model a knowledge base, it only wants to answer from there; if it doesn't find a match, I get the answer "I can't answer based on the documents". If I add a system prompt telling it to answer anyway, that doesn't work either. As soon as I take the knowledge base collections away from it, it answers, for example, the pancake recipe...
@Classic298 commented on GitHub (Feb 20, 2026):
@n4gY1 intended. If you give the model a knowledge base, you are limiting it to THAT knowledge base. If you want it to be able to access all knowledge bases, then do not add any and let it just use its built-in tools. If you want it to only access CERTAIN knowledge bases, then add those.
This is absolutely intended, and is meant for people to be able to create models with access to only a limited number of knowledge bases for certain tasks, or to protect other information from leaking, for example.
@Classic298 commented on GitHub (Feb 20, 2026):
@n4gY1 also what you are describing is different to OP's report.
OP described that the model did not see the knowledge base that is attached to the model at all.
@Classic298 commented on GitHub (Feb 20, 2026):
btw @normen does your issue still occur for you with 0.8.3?
If yes, with or without native tool calling?
@normen commented on GitHub (Feb 20, 2026):
The issue disappears when native tool calling is disabled: knowledge bases are queried and added as RAG, but then function calling is hit and miss, as expected.
Edit: And yes, it's still happening in 0.8.3.
@Classic298 commented on GitHub (Feb 20, 2026):
@normen wait, please specify: is the model ABLE to find and use the knowledge base when native tool calling is used?
If yes, I don't think we have a bug here.
That's just the model not using the correct tools, or not using the tools the correct way.
@normen commented on GitHub (Feb 20, 2026):
No. In native mode, when I instruct the model to "use the knowledge base", it only manages to do that when I have the "built-in tools" enabled, by doing a search over ALL knowledge bases. When I only add the knowledge base without "built-in tools", it doesn't get access to the knowledge base.
And in general I would also expect this (i.e. adding a KB to a model) to simply always do RAG instead of adding tools for the model; that's kind of the point for me.
@Classic298 commented on GitHub (Feb 20, 2026):
@normen ok, but then what you are describing is intended behaviour.
If you disable built-in tools, you are taking away the model's ability to query the attached knowledge bases, and therefore it isn't finding anything.
In native mode, RAG is invoked by the model.
In default mode, RAG is invoked by Open WebUI.
If you use native mode but remove the model's tools to query the KBs, then it will not work.
You can granularly adjust which tools the model has access to, so entirely disabling the built-in tools is a configuration mistake here.
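To make the distinction concrete, here is a toy sketch of who drives retrieval in each mode. None of this is Open WebUI's actual code; `search_kb`, `default_mode`, and `native_mode` are hypothetical names for illustration only:

```python
# Toy illustration of the two modes described above; all names here are
# hypothetical stand-ins, not Open WebUI internals.

def search_kb(query: str) -> list[str]:
    # stand-in for vector search over an attached knowledge base
    return [f"chunk matching {query!r}"]

def default_mode(user_msg: str, llm) -> str:
    # default mode: the platform retrieves first, then calls the model
    # once with the retrieved context injected into the prompt
    context = "\n".join(search_kb(user_msg))
    return llm(f"Context:\n{context}\n\nQuestion: {user_msg}")

def native_mode(user_msg: str, llm_with_tools) -> str:
    # native mode: the model itself decides whether and when to call the
    # retrieval tool, so removing the tool removes retrieval entirely
    return llm_with_tools(user_msg, tools={"search_kb": search_kb})
```

In `default_mode` retrieval always happens; in `native_mode` it only happens if the model has (and chooses to use) the tool.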
Thanks for double checking and explaining!
Closing.
@Classic298 commented on GitHub (Feb 20, 2026):
(I also originally read your issue wrong - right now there IS an issue that the knowledge base isn't attached even when in full context mode, and there's an open PR for that.)
@normen commented on GitHub (Feb 20, 2026):
Well, I don't want to get hung up on terminology, but I thought of the "Retrieval" part of RAG as being outside the model. If "Knowledge" attached to a model is now supposed to limit access to knowledge bases when native mode is enabled, and do RAG when native mode is disabled, I guess some more info in the UI is in order, as this is quite confusing, especially since it worked differently before.
Thanks for checking back though!
@Classic298 commented on GitHub (Feb 20, 2026):
@normen I can understand why that could be confusing
We are moving away from classical RAG where you generate queries once and feed the model once.
Now RAG becomes agentic. Agentic RAG.
The model autonomously calls the knowledge when it's actually needed.
And if the first search didn't yield good results, it can try again (which classic RAG could not do).
This is why, when using native tool calling mode, you are also turning on agentic RAG - but disabling the RAG tools (built-in tools) then handicaps the model.
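The retry behaviour described above can be sketched as a small loop. This is purely illustrative; `agentic_answer`, `search_kb`, and `rewrite_query` are hypothetical names, not Open WebUI internals:

```python
# Toy sketch of agentic RAG's retry behaviour: the model may reformulate
# its query and search again if the first attempt found nothing.
# All names are illustrative, not Open WebUI internals.

def agentic_answer(question, search_kb, rewrite_query, max_tries=3):
    query = question
    for _ in range(max_tries):
        hits = search_kb(query)
        if hits:
            return hits
        # classic one-shot RAG would stop after the first empty result;
        # agentic RAG reformulates the query and tries again
        query = rewrite_query(query)
    return []
```

Without the retrieval tool available, the loop never runs at all, which is the handicap described above.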
@Classic298 commented on GitHub (Feb 20, 2026):
I will update the docs here to be more clear on this topic
@normen commented on GitHub (Feb 20, 2026):
Yeah, just for me personally, it all comes crashing down. I was relying on Open WebUI supplying RAG while the actual requests and tools were coming from another web UI that calls the OpenAI API of Open WebUI.
This is broken now, because disabling native tools in the model is necessary for (non-agentic) RAG to work, but doing so breaks native tools on the API side.
@arslancloud commented on GitHub (Apr 13, 2026):
I see the same issue: the base model has functionCall set to 'native', but the custom model uses 'standard' (non-native function calling), so it no longer uses the knowledge base. That doesn't seem correct. Should I open a new issue @Classic298 ?
@Classic298 commented on GitHub (Apr 13, 2026):
Open an issue for what? What is the problem exactly?
@arslancloud commented on GitHub (Apr 13, 2026):
Set the base model's function calling to native, e.g. gpt 5.4.
Leave the custom model as standard, with gpt 5.4 as the base model.
The custom model doesn't use the knowledge base anymore.
I don't think it's meant to be like that?
@Classic298 commented on GitHub (Apr 13, 2026):
@arslancloud is the knowledge base directly attached to the model?
@Arslo-cloud commented on GitHub (Apr 13, 2026):
Yes, it is directly attached to the model. The strange part is that if you attach the knowledge base in the chat again, it is used properly.
@Classic298 commented on GitHub (Apr 13, 2026):
@Arslo-cloud @silentoplayz tried to reproduce this internally but couldn't; it works here. Can you try dev?
@Arslo-cloud commented on GitHub (Apr 14, 2026):
I did some more tests today. The problem seems to be that we are using the Responses API on the base model. The problem also exists on dev, but again only with the Responses API.
@Arslo-cloud commented on GitHub (Apr 14, 2026):
Could it be due to:
https://github.com/open-webui/open-webui/issues/21340#issuecomment-4244369502 ?
@Classic298 commented on GitHub (Apr 14, 2026):
not sure. Could be. Dev has new features but also fixes for responses api. you can test dev branch if you like?
@silentoplayz commented on GitHub (Apr 14, 2026):
I'm unable to reproduce this on my end. I was lent an OpenAI API key for testing the reported issue.
I added the connection in the Connections tab found within the Admin settings of Open WebUI. I used GPT 5.4 as the base model and toggled on Native function calling within the model's edit page in the Admin settings of Open WebUI. I then created a custom model in the Models section of the Workspace in Open WebUI and used gpt-5.4 as the "Base Model (From)" for this custom workspace model, leaving the model's advanced configuration settings all default. I even added a lengthy system prompt for this custom workspace model. While still modifying this custom model, I attached a knowledge base collection that contains 8 files to the custom model before saving it.
Finally, I started a new chat with this custom model and asked it a question relevant enough for it to provide an answer related to the documents contained within the knowledge base collection. The model was able to answer with all 8 sources in its context.
@Arslo-cloud commented on GitHub (Apr 17, 2026):
I could only test a bit later than I was hoping. On the latest dev version everything works fine. Thanks!