Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 19:08:59 -05:00)
Impossible to use the RAG #1640
Originally created by @Domi31tls on GitHub (Jul 30, 2024).
Bug Report
Description
When I provide a collection and ask a question, the model responds that it doesn't know. I have tried many times, modifying my prompts, the embeddings, etc., but nothing changes; the documents don't seem to be loaded.
Bug Summary:

Steps to Reproduce:
[Outline the steps to reproduce the bug. Be as detailed as possible.]
Expected Behavior:
[Describe what you expected to happen.]
Actual Behavior:
[Describe what actually happened.]
Environment
Open WebUI Version: Latest
Ollama (if applicable): latest
Ubuntu 24.04
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
[Include relevant browser console logs, if applicable]
Docker Container Logs:
[Include relevant Docker container logs, if applicable]
Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]
Installation Method
Docker
Additional Information
I tested with PDF and text files.
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
@sir3mat commented on GitHub (Jul 30, 2024):
Have you enabled hybrid search? Which provider did you choose to expose the embedding model (sentence transformers or Ollama)?
@justinh-rahb commented on GitHub (Jul 30, 2024):
What num_ctx/"Context Size" are you running your model at?

@Domi31tls commented on GitHub (Jul 30, 2024):
Thanks for your help.
I use Ollama with Chroma and all-minilm-l6-v2-f32.

@Domi31tls commented on GitHub (Jul 30, 2024):
I use the default num_ctx.
@justinh-rahb commented on GitHub (Jul 30, 2024):
@Domi31tls for Ollama models this defaults to only 2048 tokens, which is not a lot to work with for RAG. Increasing this value will vastly increase RAM usage, though. There is a PR in Ollama that seeks to alleviate that burden:
https://github.com/ollama/ollama/pull/5894
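As a sketch of the advice above: one common way to raise `num_ctx` for an Ollama model is a custom Modelfile (the base model name `llama3`, the derived name, and the 8192 value here are illustrative only; pick a context size your RAM can support):

```
# Modelfile: derive a model variant with a larger context window
FROM llama3
PARAMETER num_ctx 8192
```

Build it with `ollama create llama3-8k -f Modelfile`, then select the new model in Open WebUI.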