Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 11:28:35 -05:00)
[GH-ISSUE #23362] Bug: Notes added to folders are not treated as context #58627
Originally created by @sugoidesune on GitHub (Apr 3, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/23362
Problem Description
At a general level, I would like the LLM in a Folder (a specific project) to have a shared default context (like Knowledge) and to be able to edit that shared context.
Use cases are:
Desired Solution you'd like
Being able to add Notes to a Folder would, by default, give all chats in that Folder those notes as context and, as is already possible, allow the LLM to edit them.
The benefit of using Notes is that they are already designed to be easy for humans and LLMs to inspect and edit.
Adding them to Folders would let them function as Folder-based, interactive "Knowledge and Deliverables".
Alternatives Considered
Additional Context
No response
@silentoplayz commented on GitHub (Apr 3, 2026):
Notes can already be added to a folder, no?
@sugoidesune commented on GitHub (Apr 3, 2026):
❌ Then it might be a bug, because currently it does nothing for me.
I do have a note added, even as `entire document`, which promises to attach the entire document to context... ❌ but it is not actually added as context; it's just 'available' via tool use, and in this case the model searches memories and doesn't even check the notes.
❌ You can of course use the system prompt to force it to look up notes every time, but that wastes a lot of API calls, as I explained in the opening post.
❌❌ What's worse, in a follow-up message the note will no longer be part of the context, forcing us to make tool calls for each message, doubling input token costs.
✅ The behavior is much more as expected when adding a note directly to the chat.
❌ Although I don't understand what queries it's running, or why, on a file I am providing as context 1:1.
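A rough back-of-envelope sketch of the "doubling input token costs" claim above. All numbers and the cost model are illustrative assumptions, not measurements: fetching a note via a tool call adds a second model invocation per turn, which resends the prompt along with the tool result.

```python
# Illustrative comparison (all numbers are assumptions): injecting a note
# as persistent context vs. fetching it via a tool call on every message.

def injected_cost(turns: int, prompt: int, note: int) -> int:
    # Note is part of the prompt; one model call per turn.
    return turns * (prompt + note)

def tool_call_cost(turns: int, prompt: int, note: int) -> int:
    # Each turn: a first call with the bare prompt, then a second call
    # that resends the prompt plus the tool result containing the note.
    return turns * (prompt + (prompt + note))

turns, prompt, note = 10, 2000, 500
print(injected_cost(turns, prompt, note))   # 25000 input tokens
print(tool_call_cost(turns, prompt, note))  # 45000 input tokens
```

Under these assumed numbers the tool-call path costs roughly twice as many input tokens per conversation, consistent with the complaint above.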
@frost19k commented on GitHub (Apr 4, 2026):
The trouble is that if you have a model in native tool mode and you attach something as context for a folder, it might happen that OWUI injects the context and the model decides to do a search anyway, leading to context bloat.
However, if there's a model in native mode for which you've disabled the knowledge tool, then (if you don't inject the folder context) the folder effectively has no context: the model doesn't have the native tool, and OWUI doesn't inject the context either.
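The two failure modes described above can be sketched as a small decision table. This is a hypothetical simplification; the flag names are illustrative, not actual OWUI configuration options.

```python
# Hypothetical sketch of the design tension described above.
# Flag names are illustrative, not real OWUI settings.

def folder_context_outcome(inject_context: bool, native_tools_enabled: bool) -> str:
    if inject_context and native_tools_enabled:
        # Context is injected, but the model may still call a search tool
        # and re-fetch the same content: context bloat.
        return "bloat-risk"
    if not inject_context and not native_tools_enabled:
        # Nothing injects the notes and no tool can fetch them.
        return "no-context"
    # Exactly one mechanism supplies the notes.
    return "ok"

print(folder_context_outcome(True, True))    # bloat-risk
print(folder_context_outcome(False, False))  # no-context
print(folder_context_outcome(True, False))   # ok
```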
@sugoidesune commented on GitHub (Apr 4, 2026):
The case you described can already happen when attaching notes directly to the chat. The content is always added as context, yet the LLM doesn't really know where that context came from, and so might search and read the note again -- #23392 addresses this.
In my case, #23392 helped the LLM avoid opening the already-added note; only when editing the note does it call `view_note` again, 'to be sure' as it says, which can be mitigated with a single-sentence prompt. It might of course search for other context (memories, knowledge, other notes), but that would be desired behavior.
Nevertheless, we have almost all types of context, but we lack folder-wide + dynamic + persistent context.
Attached File (latest version is added to each message)
- Static: can't be changed by the LLM
- Dynamic: can be changed by the LLM
- Persistent: added as context to each message
- Ephemeral: context is only added to individual messages
@jgill83 commented on GitHub (Apr 9, 2026):
Verifying this behavior in OWUI latest (0.8.12)
I’m seeing the same issue with Notes-based RAG/context not behaving as expected in folders.
In my setup, the folder clearly has Notes attached under Knowledge, but when I ask a chat in that folder something that should be answered from those notes, OWUI shows search_notes being explored and then returns no usable results. The assistant then says it has no stored notes or prior context, even though the folder contains relevant notes.
From the screenshots:
This makes it look like Notes added to a folder are not being surfaced as usable context for chats inside that folder, or that search_notes is not returning attached note content correctly.
Using Anthropic Sonnet 4.6 with native tool calling enabled and Builtin Tools with Notes enabled. Adding the notes directly to the model as knowledge also does not result in RAG context working correctly.
@jlgill commented on GitHub (Apr 10, 2026):
Update: issue identified for `search_notes` returning empty on multi-word queries
After digging into this further, it looks like the behavior I reported (folder-attached notes not being found) may be related to two separate issues:
1. `search_notes` SQL bug: The `search_notes()` method in `models/notes.py` concatenates the entire multi-word query into one string after stripping all hyphens and spaces, then performs a single `ILIKE` substring match. Any query containing spaces or hyphens (e.g., "term-one term-two term-three" becomes "termonetermtwotermthree") always returns `[]`, because that concatenated string never appears contiguously in any note content. I've opened a separate issue for this with a fix ready: #23565
2. Native FC folder knowledge design: When a model uses native function calling, notes attached to a folder are not auto-injected into the prompt. Instead, they're made available via builtin tools (`list_knowledge`, `query_knowledge_files`, `view_note`). The LLM must call those tools to access them. In my case, the LLM chose `search_notes` instead -- which hit the bug above -- making it appear as though folder notes weren't working at all.
Disclosure: I did use AI tooling (GitHub Copilot) to assist with code analysis and drafting the fix. Changes were validated by manual human review, direct PostgreSQL queries confirming the SQL behavior before and after the fix, and end-to-end testing inside the running container to verify that multi-word queries now return the expected notes via the `search_notes` native function call.