Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 19:38:46 -05:00)
[GH-ISSUE #6960] Model loses document context #30076
Originally created by @icemagno on GitHub (Nov 15, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/6960
Installation Method
Using Docker, following the instructions on the main GitHub page.
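For context, the standard Docker command from the Open WebUI README looks roughly like the following (the exact flags may differ depending on the version of the README the reporter followed and on whether Ollama runs on the host):

```shell
# Run Open WebUI, connecting to an Ollama instance on the host machine.
# Persists app data (including knowledge collections) in a named volume.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```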
Environment
Open WebUI Version: v0.3.35
Ollama (if applicable): 0.4.1
Operating System: Windows 10
Browser (if applicable): Chrome
Confirmation:
Expected Behavior:
When I create a collection and attach it to a model, I expect the model to answer all my questions (all the time) based on that collection or any of its files.
Actual Behavior:
After the first question, the model completely loses the collection data, saying things like "I don't have any information about it..."
Description
Bug Summary:
Described above under "Actual Behavior".
Reproduction Details
Steps to Reproduce:
Logs and Screenshots
Browser Console Logs:
[Include relevant browser console logs, if applicable]
Docker Container Logs:
[Include relevant Docker container logs, if applicable]
Screenshots/Screen Recordings (if applicable):
[Attach any relevant screenshots to help illustrate the issue]
Additional Information
[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
@icemagno commented on GitHub (Nov 15, 2024):
It seems I'm losing the context because of its size. I think the model gives preference to the chat context instead of the document (or collection), so it throws the document away and looks only at what we chatted about before. The further I progress with my questions, the more incoherent it gets, and it starts saying nonsense. My local Ollama runs llama3 8b, so maybe that is not enough to support the knowledge collection.
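The reporter's hypothesis, that a growing chat history crowds the retrieved document chunks out of a fixed context window, can be sketched with a toy model. This is a hypothetical illustration, not Open WebUI's actual prompt-assembly code; the budget, token counter, and fill order are all assumptions:

```python
# Hypothetical sketch of how a fixed context window can squeeze out
# retrieved document chunks as chat history grows. Names and numbers
# are illustrative, not taken from Open WebUI's implementation.

CONTEXT_BUDGET = 2048  # tokens the model accepts (cf. Ollama's default num_ctx)

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer: ~1 token per whitespace word.
    return len(text.split())

def build_prompt(history: list[str], doc_chunks: list[str]) -> tuple[str, int]:
    """Fill the budget with chat history first, then as many document
    chunks as still fit. Returns the prompt and how many chunks survived."""
    used = sum(count_tokens(turn) for turn in history)
    kept = []
    for chunk in doc_chunks:
        cost = count_tokens(chunk)
        if used + cost > CONTEXT_BUDGET:
            break  # no room left: the rest of the collection is silently dropped
        kept.append(chunk)
        used += cost
    return "\n".join(history + kept), len(kept)

chunks = ["chunk " + "x " * 200] * 8        # 8 chunks of ~201 tokens each

_, kept_early = build_prompt(["question answer " * 50], chunks)      # 1 turn
_, kept_late = build_prompt(["question answer " * 50] * 12, chunks)  # 12 turns
print(kept_early, kept_late)  # fewer chunks survive as the chat grows
```

Under these assumptions, all 8 chunks fit on the first turn, but only 4 fit once a dozen turns of history are in the window, which would match the observed "works on the first question, degrades afterwards" behavior.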