[GH-ISSUE #138] Support for local RAG pipelines #11951

Closed
opened 2026-04-19 18:39:41 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @manojkr19 on GitHub (Nov 24, 2023).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/138

I have several local RAG pipelines with different indexing and retrieval strategies, each tested against a specific model. I would like to point this UI to those pipelines via custom APIs (similar to the Ollama API behind a reverse proxy). An option to supply an API key for authentication would be great.


@tjbck commented on GitHub (Nov 24, 2023):

Hi, thanks for the suggestion, but I'm not exactly sure what you mean. Could you rephrase it and provide some example use cases? Thanks!


@manojkr19 commented on GitHub (Nov 24, 2023):

We have APIs that take the user prompt, retrieve local documents, build a prompt with that local knowledge, and send the request to OpenAI, local models, etc. Basically, it's tons of prompt engineering wrapped around the model. We would like this UI to point to those APIs as a chat interface. Does this help? You can assume it's very similar to calling the Ollama API with the user chat history, retrieving the response, and showing it to the user, but instead of talking directly to the model API, the UI would talk to a custom API.
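The wrapper pattern described in this comment (retrieve local documents, augment the user's prompt with them, then forward the request to a model API) can be sketched as follows. This is a minimal illustration, not Open WebUI code; the keyword retriever and prompt template are hypothetical stand-ins for a real vector index and prompt-engineering layer:

```python
# Sketch of a RAG-wrapper step: augment the latest user message with
# retrieved local context. A real pipeline would then POST the result
# to OpenAI, Ollama, or another model API.

def retrieve(query: str, corpus: dict[str, str], top_k: int = 2) -> list[str]:
    """Naive keyword-overlap retriever standing in for a real vector search."""
    scored = []
    for doc_id, text in corpus.items():
        score = sum(1 for word in query.lower().split() if word in text.lower())
        scored.append((score, text))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [text for score, text in scored[:top_k] if score > 0]


def build_rag_prompt(history: list[dict], corpus: dict[str, str]) -> str:
    """Take the chat history, pull local context for the latest user
    message, and return the augmented prompt to send to the model."""
    user_query = next(m["content"] for m in reversed(history) if m["role"] == "user")
    context = "\n".join(retrieve(user_query, corpus))
    return (
        "Answer using the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {user_query}"
    )


if __name__ == "__main__":
    corpus = {
        "doc1": "Open WebUI talks to Ollama over a REST API.",
        "doc2": "Reverse proxies can add an API key header for authentication.",
    }
    history = [{"role": "user", "content": "How does authentication with an API key work?"}]
    print(build_rag_prompt(history, corpus))
```

In the scenario requested here, the UI would call this wrapper's endpoint instead of the model API directly, with the wrapper handling retrieval and prompt construction behind the scenes.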


@tjbck commented on GitHub (Dec 5, 2023):

I'm actively working on a built-in RAG feature for the webui; once that's completed, I'll try to implement your requested feature if it aligns with the webui. Let's continue our discussion here: #31. Thanks!


Reference: github-starred/open-webui#11951