[GH-ISSUE #6960] Model loose document context #53214

Closed
opened 2026-05-05 14:27:23 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @icemagno on GitHub (Nov 15, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/6960

Installation Method

Using Docker, following the instructions on the main GitHub page

Environment

  • Open WebUI Version: v0.3.35

  • Ollama (if applicable): 0.4.1

  • Operating System: Windows 10

  • Browser (if applicable): Chrome

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

When I create a collection and attach it to a model, I expect the model to answer all my questions (every time) based on that collection or any of its files.

Actual Behavior:

After the first question, the model completely loses the collection data, saying things like "I don't have any information about it..."

Description

Bug Summary:
See "Actual Behavior" above.

Reproduction Details

Steps to Reproduce:

  1. Create a collection.
  2. Upload one XML file with a list of items and some columns.
  3. Create a new model based on Llama3.2.
  4. Attach the collection to that model.
  5. In a new chat, ask how many items it found in the document.
  6. Receive the correct answer.
  7. Ask again.
  8. Receive "I know nothing about it. Tell me where is the document you're talking about" as the answer.
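The steps above can also be driven programmatically, which makes the regression between step 6 and step 8 easy to replay. As a hedged sketch only: the endpoint path, the `files` field, and the collection-attachment payload shape are assumptions about Open WebUI's OpenAI-compatible chat API, not verified against v0.3.35, and the model and collection identifiers are hypothetical.

```python
import json

# Hypothetical helper: builds the body for Open WebUI's OpenAI-compatible
# chat endpoint (assumed: POST /api/chat/completions). The "files" entry
# referencing a knowledge collection by id is an assumed payload shape.
def build_chat_payload(model: str, question: str, collection_id: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": question}],
        # Assumed shape for attaching a knowledge collection to the request.
        "files": [{"type": "collection", "id": collection_id}],
    }

payload = build_chat_payload(
    "llama3.2-with-docs",        # hypothetical custom model name
    "How many items are in the XML document?",
    "my-collection-id",          # hypothetical collection id
)
print(json.dumps(payload, indent=2))
```

Sending the same payload twice in one chat session (appending the first answer to `messages`) would show whether the second turn still retrieves from the collection.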

Logs and Screenshots

![Capturar](https://github.com/user-attachments/assets/62a0f8b0-b0a0-4cb1-ac1b-8662c2d36ae6)

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]

Screenshots/Screen Recordings (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!


@icemagno commented on GitHub (Nov 15, 2024):

It seems I'm losing the context because of its size. I think it gives preference to the chat context instead of the document (or collection), so it throws away the document and looks only at what we chatted about before. As I progress through my questions, it gets more confused and starts to say nonsense. My local Ollama has llama3 8b, so I think it is not enough to support the knowledge collection.
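One plausible mechanism behind the comment above: Ollama runs models with a default context window of `num_ctx = 2048` tokens unless overridden, so as chat history grows, the retrieved document chunks can be squeezed out of the prompt. A rough back-of-the-envelope check (the 4-characters-per-token ratio is a common approximation, not exact, and the chunk sizes below are made up for illustration):

```python
# Rough sketch: estimate whether chat history plus retrieved document
# chunks still fit in the model's context window. Ollama's default
# num_ctx is 2048 tokens; ~4 characters per token is a crude heuristic.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

def fits_context(history: list[str], retrieved_chunks: list[str],
                 num_ctx: int = 2048, reply_budget: int = 256) -> bool:
    used = sum(estimate_tokens(t) for t in history)
    used += sum(estimate_tokens(t) for t in retrieved_chunks)
    return used + reply_budget <= num_ctx

chunks = ["<item>...</item>" * 40] * 3      # ~3 retrieved XML chunks
first_turn = ["How many items are in the document?"]
long_history = first_turn + ["follow-up question " * 30] * 10

print(fits_context(first_turn, chunks))     # first question: fits
print(fits_context(long_history, chunks))   # later turns: may not fit
```

If this is the cause, raising `num_ctx` in the model's parameters (or in a Modelfile) should delay the point at which the collection "disappears", which would be one way to confirm the hypothesis.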

Reference: github-starred/open-webui#53214