[GH-ISSUE #21299] issue: Knowledge attached to a model isn't used anymore #58100

Closed
opened 2026-05-05 22:19:43 -05:00 by GiteaMirror · 31 comments
Owner

Originally created by @normen on GitHub (Feb 10, 2026).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/21299

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

0.7.2

Ollama Version (if applicable)

No response

Operating System

Ubuntu 22

Browser (if applicable)

Firefox

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

When creating a model and attaching a knowledge base to it, I expect the model to automatically perform RAG on that knowledge base, even when it's called via the API. This was the case until a few versions ago.

Actual Behavior

The knowledge base RAG is only used when manually adding the knowledge base to any chat, not when selecting a model with configured knowledge base.

Steps to Reproduce

  1. Create a knowledge base with some data
  2. Create a new model
  3. Attach the knowledge base to the model
  4. Configure the rest of the model
  5. Save the model
  6. Talk to model about knowledge in knowledge base
  7. The model is oblivious
  8. Add knowledge base to chat directly
  9. The model gets RAG information and can see the knowledge base data
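Since the reporter also calls the model via the API, step 6 can be exercised programmatically through Open WebUI's OpenAI-compatible chat endpoint. A rough sketch follows; the base URL, API key, and model ID are placeholders (not values from this issue), and the actual HTTP call is left commented out so the payload construction stands on its own.

```python
# Sketch: query a workspace model (with an attached knowledge base)
# through Open WebUI's OpenAI-compatible endpoint.
# The URL, token, and model ID below are placeholders.

def build_chat_payload(model_id: str, question: str) -> dict:
    """Build an OpenAI-style chat completion payload for Open WebUI."""
    return {
        "model": model_id,  # ID of the workspace model with the KB attached
        "messages": [{"role": "user", "content": question}],
        "stream": False,
    }

payload = build_chat_payload(
    "my-kb-model", "What does the knowledge base say about X?"
)

# To actually send the request (requires a running Open WebUI instance):
# import requests
# resp = requests.post(
#     "http://localhost:3000/api/chat/completions",
#     headers={"Authorization": "Bearer <API_KEY>"},
#     json=payload,
# )
# If model-attached knowledge works, the response should cite KB sources;
# in the versions reported here it answers without any retrieved context.
```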

Logs & Screenshots

No logs, as there are no errors.

Additional Information

No response

GiteaMirror added the bug label 2026-05-05 22:19:43 -05:00

@Classic298 commented on GitHub (Feb 10, 2026):

Is this even an issue? I am pretty sure it's solved in dev if i read it correctly.

And what is your proposed solution? Your message is empty "AI" gentsy


@normen commented on GitHub (Feb 11, 2026):

> Is this even an issue? I am pretty sure it's solved in dev if i read it correctly.
>
> And what is your proposed solution? Your message is empty "AI" gentsy

I guess it's an issue; why would you still be able to attach a knowledge base to a model if it's never used? I suppose this is a regression from when the knowledge base revamp and the internal functions were added.

I don't use the dev version; if it's true that knowledge bases attached to models are working again, then all the better.


@Classic298 commented on GitHub (Feb 11, 2026):

AI gentsy deleted his message. Alright.

> I guess its an issue, why would you still be able to attach a knowledge base to a model if its never used? I suppose this is a regression from when the knowledge base revamp and the internal functions were added.

Yes, yes, of course it's an issue on main, haha, but by "is this even an issue" I did mean whether it was still an issue.
Ah wait, I remember something.

I think the PR that fixes it is still open.

Keeping this open for tracking purposes, but it should be solved soon.


@normen commented on GitHub (Feb 15, 2026):

The latest version 0.8.1 still has this issue, knowledge bases attached to a model are not used.


@Classic298 commented on GitHub (Feb 15, 2026):

Yes. The PR was not merged yet.


@MacJedi42 commented on GitHub (Feb 19, 2026):

Forgive me if this is unrelated, but I have noticed that even on 0.8.3 (and the dev branch as of writing), attached files no longer have any content inside them; my model can't get data from inside a text file. When I attach a .txt file directly to the chat with a message, the model informs me that it can't see the content or the attached file. When I click the file in the UI, even though it shows as 1.8KB in size, the content shows as none.

Image

Hopefully this is part of the same problem, as I suspect the Knowledge retrieval uses the same mechanism in the backend as file attachments.


@Classic298 commented on GitHub (Feb 19, 2026):

yes this is unrelated


@n4gY1 commented on GitHub (Feb 20, 2026):

I also noticed this problem. If I give the created model a knowledge base, it only wants to answer from there; if it doesn't find a match, I get the answer "I can't answer based on the documents". Even if I instruct it in the system prompt to answer anyway, it doesn't work. As soon as I take the knowledge base collections away from it, it answers, for example, the pancake recipe...


@Classic298 commented on GitHub (Feb 20, 2026):

@n4gY1 intended. If you give the model a knowledge base, you are limiting it to THAT knowledge base. If you want it to be able to access all knowledge bases, then do not add any and let it just use its built-in tools. If you want it to only access CERTAIN knowledge bases, then add those.

This is absolutely intended, and is meant so that people can create models with access to only a limited number of knowledge bases for certain tasks, or to protect other information from leaking, for example.


@Classic298 commented on GitHub (Feb 20, 2026):

@n4gY1 also what you are describing is different to OP's report.

OP described that the model did not see the knowledge base that is attached to the model at all.


@Classic298 commented on GitHub (Feb 20, 2026):

btw @normen is your issue still true for you with 0.8.3?

If yes, with or without native tool calling?


@normen commented on GitHub (Feb 20, 2026):

> btw @normen is your issue still true for you with 0.8.3?
>
> If yes, with or without native tool calling?

The issue disappears when native tool calling is disabled: knowledge bases are queried and added as RAG, but then function calling is hit and miss, as expected.

Edit: and yes, it's still happening in 0.8.3.


@Classic298 commented on GitHub (Feb 20, 2026):

@normen wait, please specify: is the model ABLE to find and use the knowledge base when native tool calling is used?

If yes, I don't think we have a bug here.
That's just the model not using the correct tools, or not using the tools the correct way.


@normen commented on GitHub (Feb 20, 2026):

> @normen wait please specify. Is the model ABLE to find and use the knowledge base when native tool calling is used?
>
> If yes, I don't think we have a bug here. That's just the model not using the correct tools or using the tools the correct way

No, in native mode, when I instruct the model to "use the knowledge base" it only manages to do that when I have the "built-in tools" enabled, by doing a search over ALL knowledge bases. When I only add the knowledge base without "built-in tools" it doesn't get access to the knowledge base.

And in general I would also expect this (i.e. adding a KB to a model) to simply always do RAG instead of adding tools for the model; that's kind of the point for me.


@Classic298 commented on GitHub (Feb 20, 2026):

@normen ok, but then in this case what you are describing is intended behaviour.

If you disable built-in tools, you are taking away the model's ability to query the attached knowledge bases, and therefore you aren't finding anything.

In native mode, RAG is invoked by the model.
In default mode, RAG is invoked by Open WebUI.

If you use native mode but remove the model's tools to query the KBs, then it will not work.

You can granularly adjust which tools the model has access to, so entirely disabling the built-in tools is a configuration mistake here.

Thanks for double-checking and explaining!
Closing.
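The distinction above can be sketched with stub functions: in default mode the server retrieves KB chunks and injects them before the model ever runs, while in native mode retrieval only happens if the query tool is exposed to the model. All names here are illustrative, not Open WebUI's actual internals.

```python
# Illustrative sketch of the two flows described above; the function and
# variable names are hypothetical, not Open WebUI's real implementation.

def retrieve(kb: dict, query: str) -> list[str]:
    """Stub retriever: naive keyword match over a tiny in-memory KB."""
    return [doc for doc in kb["docs"] if query.lower() in doc.lower()]

def default_mode_chat(kb: dict, question: str) -> str:
    # Default mode: the server retrieves and injects context unconditionally.
    context = retrieve(kb, question)
    return f"Context: {context}\nQuestion: {question}"  # sent to the model

def native_mode_chat(kb: dict, question: str, tools_enabled: bool) -> str:
    # Native mode: the model can only see the KB if the query tool is exposed.
    if not tools_enabled:
        return f"Question: {question}"  # no retrieval path at all
    context = retrieve(kb, question)   # model decides to call the tool
    return f"Context: {context}\nQuestion: {question}"

kb = {"docs": ["Pancake recipe: flour, milk, eggs."]}
assert "Pancake" in default_mode_chat(kb, "pancake")
assert "Context" not in native_mode_chat(kb, "pancake", tools_enabled=False)
```

This mirrors the reported symptom: with native tool calling on but built-in tools off, the prompt never gains any KB context.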


@Classic298 commented on GitHub (Feb 20, 2026):

(I also originally read your issue wrong. Right now there IS an issue where the knowledge base isn't attached even in full-context mode; there's an open PR for it.)


@normen commented on GitHub (Feb 20, 2026):

> @normen ok but then in this case what you are describing is intended behaviour.
>
> If you disable built in tools, you are taking the model's ability to query the attached knowledge bases and therefore you aren't finding anything.
>
> In native mode, RAG is invoked by the model. In default mode, RAG is invoked by open webui
>
> if you use native mode, but remove the model's tools to query the KBs, then it will not work.
>
> You can granularly adjust which tools the model has access to, so entirely disabling the built in tools is a configuration mistake here.
>
> thanks for double checking and explaining! closing.

Well, I don't want to get hung up on terminology, but I thought of the "Retrieval" part of RAG as being outside of the model. If "Knowledge" attached to a model is now supposed to limit access to knowledge bases when native mode is enabled, and do RAG when native mode is disabled, I guess some more info in the UI is in order, as this is quite confusing, especially since it worked differently before.

Thanks for checking back though!


@Classic298 commented on GitHub (Feb 20, 2026):

@normen I can understand why that could be confusing

We are moving away from classical RAG where you generate queries once and feed the model once.

Now RAG becomes agentic: agentic RAG.
The model autonomously calls the knowledge when it's actually needed.
And if the first search didn't yield good results, it can try again (which classic RAG could not do).

This is why, when using native tool calling mode, you are also turning on agentic RAG; but disabling the RAG tools (built-in tools) is then handicapping the model.
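The agentic pattern described here (search, inspect the results, retry with a reformulated query) can be sketched roughly as a toy loop. `search` and `reformulate` below are stand-ins for what the model and its built-in tools would do, not real Open WebUI APIs.

```python
# Toy sketch of agentic retrieval with retry, as described above.
# search() and reformulate() are hypothetical stand-ins.

def search(kb: list[str], query: str) -> list[str]:
    """Stub search tool: naive substring match over the KB."""
    return [d for d in kb if query.lower() in d.lower()]

def reformulate(query: str) -> str:
    # Stand-in for the model rewriting its own query after a miss,
    # e.g. falling back to the first keyword.
    return query.split()[0]

def agentic_retrieve(kb: list[str], query: str, max_tries: int = 2) -> list[str]:
    """Retry retrieval with a reformulated query, unlike one-shot classic RAG."""
    for _ in range(max_tries):
        hits = search(kb, query)
        if hits:
            return hits
        query = reformulate(query)  # model tries again with a new query
    return []

kb = ["Pancake recipe: flour, milk, eggs."]
# The broad first query misses; the reformulated "pancake" query hits.
hits = agentic_retrieve(kb, "pancake breakfast ideas")
```

Classic one-shot RAG would stop after the first miss; the loop is what makes the retrieval "agentic".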


@Classic298 commented on GitHub (Feb 20, 2026):

I will update the docs here to be more clear on this topic


@normen commented on GitHub (Feb 20, 2026):

> @normen I can understand why that could be confusing
>
> We are moving away from classical RAG where you generate queries once and feed the model once.
>
> Now RAG becomes agentic. Agentic RAG. The model autonomously calls the knowledge when it's actually needed. And if the first search didn't yield good results, it can try again (which classic RAG could not do)
>
> This is why when using native tool calling mode, you are turning on agentic RAG also - but disabling the RAG tools (builtin tools) is then handicapping the model

Yeah, just for me personally, it all comes crashing down. I was relying on Open WebUI supplying RAG while the actual requests and tools were coming from another web UI that calls the OpenAI API of Open WebUI.

This is broken now, because disabling native tools in the model is necessary for (non-agentic) RAG to work, but that breaks native tools on the API side.


@arslancloud commented on GitHub (Apr 13, 2026):

I see the same issue: the base model has functionCall set to 'native', but the custom model uses 'standard' (non-native function calling), so it no longer uses the knowledge base. That doesn't seem correct. Should I open a new issue, @Classic298?


@Classic298 commented on GitHub (Apr 13, 2026):

Open an issue for what? What is the problem exactly?


@arslancloud commented on GitHub (Apr 13, 2026):

Set the base model's function calling to native, e.g. gpt 5.4.
Leave the custom model as standard, with gpt 5.4 as the base model.
The custom model doesn't use the knowledge base anymore.
I don't think it's meant to be like that?


@Classic298 commented on GitHub (Apr 13, 2026):

@arslancloud is the knowledge base directly attached to the model?


@Arslo-cloud commented on GitHub (Apr 13, 2026):

Yes, it is directly attached to the model. The strange part is that if you attach the knowledge base in the chat again, it will use it properly.


@Classic298 commented on GitHub (Apr 13, 2026):

@Arslo-cloud @silentoplayz tried to reproduce this internally but couldn't; it works here. Can you try dev?


@Arslo-cloud commented on GitHub (Apr 14, 2026):

Did some more tests today. The problem seems to be that we are using the Responses API on the base model. The problem also exists on dev, but again only with the Responses API.


@Arslo-cloud commented on GitHub (Apr 14, 2026):

Could it be due to:
https://github.com/open-webui/open-webui/issues/21340#issuecomment-4244369502 ?


@Classic298 commented on GitHub (Apr 14, 2026):

Not sure; could be. Dev has new features, but also fixes for the Responses API. You can test the dev branch if you like?


@silentoplayz commented on GitHub (Apr 14, 2026):

> Did some more tests today. The problem seems to be that we are using response API on the base model. The problem also exist on dev but also just with responses API.

I'm unable to reproduce here on my end. I was provided an OpenAI API key to borrow for testing the reported issue here.

I added the connection in the Connection tab found within the Admin settings of Open WebUI.

Image

I used GPT 5.4 as the base model and toggled on Native function calling within the model's edit page in the Admin settings of Open WebUI.

Image

I then went to create a custom model in the Models section of the Workspace in Open WebUI and used gpt-5.4 as the "Base Model (From)" for this custom workspace model, leaving this model's advanced configuration settings all default. I even added a lengthy system prompt used for this custom workspace model.

Image Image

While still modifying this custom model, I attached a Knowledgebase Collection that contains 8 files to the custom model before saving it.

Image

Finally, I started a new chat with this custom model and asked it a question relevant enough for it to provide an answer related to the documents contained within the knowledge base collection. The model was able to answer me with all 8 sources in its context.

Image

@Arslo-cloud commented on GitHub (Apr 17, 2026):

I could only test a bit later than I was hoping. On the latest dev version everything works fine. Thanks!

Reference: github-starred/open-webui#58100