issue: RAG Hybrid Search still broken #5829

Closed
opened 2025-11-11 16:35:02 -06:00 by GiteaMirror · 40 comments
Owner

Originally created by @dotmobo on GitHub (Jul 21, 2025).

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

v0.6.18

Ollama Version (if applicable)

No response

Operating System

Ubuntu 22.04

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

Hybrid search with embedding and reranking should retrieve data from the model's knowledge base.
Broken since v0.6.16.

Actual Behavior

Since v0.6.16, the hybrid search functionality (using an OpenAI embedding model and an OpenAI reranking model) has stopped working.

In v0.6.17, I saw in the changelog: "Hybrid Search Functionality Restored". But it still doesn't work for me.
Still broken in v0.6.18.

If I roll back to v0.6.15, it works fine.

Steps to Reproduce

  • Create a knowledge base

  • Create a model and add the knowledge base

  • Use the following RAG parameters

    RAG_EMBEDDING_ENGINE: "openai"
    RAG_OPENAI_API_BASE_URL: "https://my-openai-url"
    RAG_OPENAI_API_KEY: "s3Cr3t"
    RAG_EMBEDDING_MODEL: "nomic"
    CONTENT_EXTRACTION_ENGINE: "tika"
    TIKA_SERVER_URL: "https://my-tika-url"
    CHUNK_SIZE: "800"
    RAG_EMBEDDING_OPENAI_BATCH_SIZE: "64"
    RAG_TOP_K: "12"
    RAG_TOP_K_RERANKER: "6"
    RAG_RELEVANCE_THRESHOLD: "0.0"
    RAG_TEXT_SPLITTER: "character"
    RAG_FILE_MAX_SIZE: "1000"
    RAG_FILE_MAX_COUNT: "10"
    ENABLE_RAG_HYBRID_SEARCH: "true"
    RAG_RERANKING_ENGINE: "external"
    RAG_RERANKING_MODEL: "bge-reranker"
    RAG_EXTERNAL_RERANKER_URL: "https://my-openai-url/v1/rerank"
    RAG_EXTERNAL_RERANKER_API_KEY: "s3cr3t"
    
  • Ask a question to the model
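
Before digging into Open WebUI itself, the external reranker endpoint can be sanity-checked in isolation. A minimal sketch, assuming the LiteLLM gateway exposes the common Cohere-style `/v1/rerank` request body (the field names here are assumptions based on that convention, not taken from the Open WebUI source):

```python
# Hypothetical standalone check of the RAG_EXTERNAL_RERANKER_URL endpoint.
import json
import urllib.request

def build_rerank_payload(query, documents, model="bge-reranker", top_n=6):
    """Build a Cohere-style rerank request body (assumed field names)."""
    return {"model": model, "query": query,
            "documents": documents, "top_n": top_n}

def rerank(url, api_key, payload):
    """POST the payload to the reranker and return the parsed JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    payload = build_rerank_payload("What is ownership in Rust?",
                                   ["chunk about ownership", "unrelated chunk"])
    print(json.dumps(payload, indent=2))
    # rerank("https://my-openai-url/v1/rerank", "s3cr3t", payload)
```

If this call succeeds outside Open WebUI, the reranker itself is reachable and the problem lies in the retrieval pipeline.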

Logs & Screenshots

I can see in the logs:

2025-07-21 08:55:49.781 | INFO | httpx._client:_send_single_request:1025 - HTTP Request: GET http://192.168.0.10:6333/collections/open-webui_knowledge/exists "HTTP/1.1 200 OK" - {}
2025-07-21 08:55:49.782 | INFO | open_webui.retrieval.utils:query_collection_with_hybrid_search:352 - Starting hybrid search for 1 query in 1 collection... - {}
2025-07-21 08:55:49.839 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 130.79.202.31:0 - "POST /api/chat/completions HTTP/1.1" 200 - {}

But no documents are retrieved.
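
One way to rule out an indexing problem is to query Qdrant directly and confirm the collection actually holds points. This uses the standard Qdrant REST endpoint `POST /collections/{name}/points/scroll`; the host, port, and collection name are taken from the logs above:

```python
# Inspect the Qdrant collection directly, independent of Open WebUI.
import json
import urllib.request

QDRANT_URL = "http://192.168.0.10:6333"

def build_scroll_body(limit=5):
    """Request body for Qdrant's points/scroll endpoint."""
    return {"limit": limit, "with_payload": True}

def scroll_points(collection, limit=5):
    """Fetch a few points (with payloads) from a Qdrant collection."""
    req = urllib.request.Request(
        f"{QDRANT_URL}/collections/{collection}/points/scroll",
        data=json.dumps(build_scroll_body(limit)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# points = scroll_points("open-webui_knowledge")
# A non-empty result["points"] list would show the documents were embedded.
```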

Additional Information

No response

GiteaMirror added the bug label 2025-11-11 16:35:02 -06:00
Author
Owner

@tjbck commented on GitHub (Jul 21, 2025):

We're unable to reproduce, could you share more backend logs with GLOBAL_LOG_LEVEL set to debug?

Author
Owner

@dotmobo commented on GitHub (Jul 21, 2025):

Sure

2025-07-21 09:24:40.961 | DEBUG    | open_webui.utils.middleware:process_chat_payload:903 - tool_servers=[] - {}
2025-07-21 09:24:40.961 | DEBUG    | open_webui.routers.tasks:generate_queries:518 - generating retrieval queries using model qwen2.5-mini for user morgan@noreply.fr - {}
2025-07-21 09:24:40.962 | DEBUG    | open_webui.utils.chat:generate_chat_completion:167 - generate_chat_completion: {'model': 'qwen2.5-mini', 'messages': [{'role': 'user', 'content': '###
2025-07-21 09:24:40.963 | DEBUG    | aiocache.base:get:201 - GET open_webui.routers.openaiget_all_models(<starlette.requests.Request object at 0x7f6dab939e50>,)[('user', UserModel(id='3744
2025-07-21 09:24:40.963 | INFO     | open_webui.routers.openai:get_all_models:392 - get_all_models() - {}
2025-07-21 09:24:40.973 | DEBUG    | open_webui.routers.openai:get_all_models_responses:373 - get_all_models:responses() [{'data': [{'id': 'bge-reranker', 'object': 'model', 'created': 167
2025-07-21 09:24:40.973 | DEBUG    | open_webui.routers.openai:merge_models_lists:407 - merge_models_lists <map object at 0x7f6dab960c40> - {}
2025-07-21 09:24:40.973 | DEBUG    | open_webui.routers.openai:get_all_models:446 - models: {'data': [{'id': 'bge-reranker', 'object': 'model', 'created': 1677610602, 'owned_by': 'openai',
2025-07-21 09:24:40.974 | DEBUG    | aiocache.base:set:280 - SET open_webui.routers.openaiget_all_models(<starlette.requests.Request object at 0x7f6dab939e50>,)[('user', UserModel(id='3744
2025-07-21 09:24:41.115 | DEBUG    | open_webui.retrieval.utils:get_sources_from_items:473 - items: [{'id': '6af4e414-0a4b-40c1-8ea7-6b6c5b769692', 'user_id': '37441ea6-7a20-40ed-8336-07ff
2025-07-21 09:24:41.115 | DEBUG    | open_webui.retrieval.utils:query_collection_with_hybrid_search:342 - query_collection_with_hybrid_search:VECTOR_DB_CLIENT.get:collection 6af4e414-0a4b-
2025-07-21 09:24:41.116 | DEBUG    | httpcore._trace:trace:47 - connect_tcp.started host='192.168.0.10' port=6333 local_address=None timeout=5.0 socket_options=None - {}
2025-07-21 09:24:41.116 | DEBUG    | httpcore._trace:trace:47 - connect_tcp.complete return_value=<httpcore._backends.sync.SyncStream object at 0x7f6dab967e10> - {}
2025-07-21 09:24:41.116 | DEBUG    | httpcore._trace:trace:47 - send_request_headers.started request=<Request [b'GET']> - {}
2025-07-21 09:24:41.117 | DEBUG    | httpcore._trace:trace:47 - send_request_headers.complete - {}
2025-07-21 09:24:41.117 | DEBUG    | httpcore._trace:trace:47 - send_request_body.started request=<Request [b'GET']> - {}
2025-07-21 09:24:41.117 | DEBUG    | httpcore._trace:trace:47 - send_request_body.complete - {}
2025-07-21 09:24:41.117 | DEBUG    | httpcore._trace:trace:47 - receive_response_headers.started request=<Request [b'GET']> - {}
2025-07-21 09:24:41.117 | DEBUG    | httpcore._trace:trace:47 - receive_response_headers.complete return_value=(b'HTTP/1.1', 200, b'OK', [(b'transfer-encoding', b'chunked'), (b'content-typ
2025-07-21 09:24:41.117 | INFO     | httpx._client:_send_single_request:1025 - HTTP Request: GET http://192.168.0.10:6333/collections/open-webui_knowledge/exists "HTTP/1.1 200 OK" - {}
2025-07-21 09:24:41.118 | DEBUG    | httpcore._trace:trace:47 - receive_response_body.started request=<Request [b'GET']> - {}
2025-07-21 09:24:41.118 | DEBUG    | httpcore._trace:trace:47 - receive_response_body.complete - {}
2025-07-21 09:24:41.118 | DEBUG    | httpcore._trace:trace:47 - response_closed.started - {}
2025-07-21 09:24:41.118 | DEBUG    | httpcore._trace:trace:47 - response_closed.complete - {}
2025-07-21 09:24:41.119 | DEBUG    | open_webui.retrieval.vector.dbs.qdrant_multitenancy:get:296 - Collection open-webui_knowledge doesn't exist, get returns None - {}
2025-07-21 09:24:41.119 | INFO     | open_webui.retrieval.utils:query_collection_with_hybrid_search:352 - Starting hybrid search for 1 queries in 1 collections... - {}
2025-07-21 09:24:41.119 | DEBUG    | open_webui.utils.middleware:chat_completion_files_handler:675 - rag_contexts:sources: [{'source': {'id': '6af4e414-0a4b-40c1-8ea7-6b6c5b769692', 'user_
2025-07-21 09:24:41.119 | DEBUG    | open_webui.utils.middleware:process_chat_payload:991 - With a 0 relevancy threshold for RAG, the context cannot be empty - {}
2025-07-21 09:24:41.126 | DEBUG    | open_webui.utils.chat:generate_chat_completion:167 - generate_chat_completion: {'stream': True, 'model': 'test-rust', 'messages': [{'role': 'user', 'co '37441ea6-7a20-40ed-8336-07ff74a79107', 'name': 'Rust', 'description': 'Test', 'meta': None, 'access_control': {'read': {'group_ids': [], 'user_ids': []}, 'write': {'group_ids': [], 'user_ids': []}}, 'created_at': 1753086912, 'updated_at': 1753087343, 'user': {'id': '37441ea6-7a20-40ed-8336-07ff74a79107', 'name': 'Morgan', 'email': 'morgan@noreply.fr', 'role': 'admin', 'profile_image_url': '/user.png'}, 'files': [{'id': '4d9bebfa-4a58-4a12-9587-349321aec2f1', 'meta': {'name': 'rust_book.pdf', 'content_type': 'application/pdf', 'size': 3836853, 'data': {}, 'collection_name': '6af4e414-0a4b-40c1-8ea7-6b6c5b769692'}, 'created_at': 1753087325, 'updated_at': 1753087325}], 'type': 'collection'}], 'features': {'image_generation': False, 'code_interpreter': False, 'web_search': False, 'memory': False}, 'variables': {'{{USER_NAME}}': 'Morgan', '{{USER_LOCATION}}': 'Unknown', '{{CURRENT_DATETIME}}': '2025-07-21 11:24:40', '{{CURRENT_DATE}}': '2025-07-21', '{{CURRENT_TIME}}': '11:24:40', '{{CURRENT_WEEKDAY}}': 'Monday', '{{CURRENT_TIMEZONE}}': 'Europe/Paris', '{{USER_LANGUAGE}}': 'fr-FR'}, 'model': {'id': 'test-rust', 'name': 'Test rust', 'object': 'model', 'created': 1753087367, 'owned_by': 'openai', 'info': {'id': 'test-rust', 'user_id': '37441ea6-7a20-40ed-8336-07ff74a79107', 'base_model_id': 'qwen3', 'name': 'Test rust', 'params': {'system': '/no_think'}, 'meta': {'profile_image_url': '/static/favicon.png', 'description': None, 'capabilities': {'vision': True, 'file_upload': True, 'web_search': True, 'image_generation': True, 'code_interpreter': True, 'citations': True, 'usage': False}, 'suggestion_prompts': None, 'tags': [], 'knowledge': [{'id': '6af4e414-0a4b-40c1-8ea7-6b6c5b769692', 'user_id': '37441ea6-7a20-40ed-8336-07ff74a79107', 'name': 'Rust', 'description': 'Test', 'meta': 
None, 'access_control': {'read': {'group_ids': [], 'user_ids': []}, 'write': {'group_ids': [], 'user_ids': []}}, 'created_at': 1753086912, 'updated_at': 1753087343, 'user': {'id': '37441ea6-7a20-40ed-8336-07ff74a79107', 'name': 'Morgan', 'email': 'morgan@noreply.fr', 'role': 'admin', 'profile_image_url': '/user.png'}, 'files': [{'id': '4d9bebfa-4a58-4a12-9587-349321aec2f1', 'meta': {'name': 'rust_book.pdf', 'content_type': 'application/pdf', 'size': 3836853, 'data': {}, 'collection_name': '6af4e414-0a4b-40c1-8ea7-6b6c5b769692'}, 'created_at': 1753087325, 'updated_at': 1753087325}], 'type': 'collection'}]}, 'access_control': {'read': {'group_ids': [], 'user_ids': []}, 'write': {'group_ids': [], 'user_ids': []}}, 'is_active': True, 'updated_at': 1753087367, 'created_at': 1753087367}, 'preset': True, 'actions': [], 'filters': [], 'tags': []}, 'direct': False}} - {}
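
The interesting line in this trace is `Collection open-webui_knowledge doesn't exist, get returns None`, immediately before hybrid search starts. A simplified model of the failure mode this hints at, assuming the hybrid pipeline builds its keyword side from that `get` result (the function below is illustrative, not Open WebUI's actual internals):

```python
# Illustrative model: if the vector-DB `get` returns None, there is no
# corpus for the BM25 side of the hybrid retriever, and a per-collection
# try/except in a real pipeline would leave the caller with an empty
# source list rather than an error.
def query_collection_hybrid(collection, query):
    if collection is None:
        # No documents to search over; fail silently with no results.
        return []
    # ... embed query, run BM25 + vector retrieval, rerank ...
    return [doc for doc in collection if query.lower() in doc.lower()]

print(query_collection_hybrid(None, "ownership"))                 # []
print(query_collection_hybrid(["Rust ownership"], "ownership"))   # ['Rust ownership']
```

That would be consistent with the log: hybrid search "starts" but no embedding or reranker calls ever appear afterwards.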
Author
Owner

@dotmobo commented on GitHub (Jul 21, 2025):

The answer in v0.6.18:

![Image](https://github.com/user-attachments/assets/39458d8d-6d49-4402-b94c-7ec1d7a08fd2)

The same question, but in v0.6.15:

![Image](https://github.com/user-attachments/assets/fa0b89bb-aa06-4e0d-8199-27730a6b2713)

You can see the mention of rust_book.pdf.

Author
Owner

@dotmobo commented on GitHub (Jul 21, 2025):

My document config screen:

![Image](https://github.com/user-attachments/assets/1f241dee-9285-46fc-bd46-d09ddd08c24d)

I use a LiteLLM gateway to serve the embedding and reranking models through an OpenAI-compatible API.

Author
Owner

@dotmobo commented on GitHub (Jul 21, 2025):

The following lines are present in the v0.6.15 log but missing in the v0.6.18 log:

2025-07-21 09:29:32.465 | DEBUG    | open_webui.retrieval.utils:generate_openai_batch_embeddings:667 - generate_openai_batch_embeddings:model nomic batch size: 1 - {}
....
2025-07-21 09:29:32.507 | INFO     | open_webui.retrieval.models.external:predict:36 - ExternalReranker:predict:model bge-reranker - {}
...
2025-07-21 09:29:32.555 | INFO     | open_webui.retrieval.utils:query_doc_with_hybrid_search:181 - query_doc_with_hybrid_search:result [[{'Content-Type': 'application/pdf', 'name': 'rust_book.pdf', 'created_by': '37441ea6-7a20-40ed-8336-07ff74a79107', 'file_id': '4d9bebfa-4a58-4a12-9587-349321aec2f1', 'source': 'rust_book.pdf', 'start_index': 0, 'hash': '1e6f21ec928a0ad5c77e773f3511434b9b95aa9cab72da398aaeaa7aa1500cfe', 'embedding_config': '{"engine": "openai", "model": "nomic"}', 'score': 0.3415825068950653}, {'Content-Type': 'application/pdf', 'name': 'rust_book.pdf', 'created_by': '37441ea6-7a20-40ed-8336-07ff74a79107', 'file_id': '4d9bebfa-4a58-4a12-9587-349321aec2f1', 'source': 'rust_book.pdf', 'start_index': 0, 'hash': '1e6f21ec928a0ad5c77e773f3511434b9b95aa9cab72da398aaeaa7aa1500cfe', 'embedding_config': '{"engine": "openai", "model": "nomic"}', 'score': 0.2267836332321167}, {'Content-Type': 'application/pdf', 'name': 'rust_book.pdf', 'created_by': '37441ea6-7a20-40ed-8336-07ff74a79107', 'file_id': '4d9bebfa-4a58-4a12-9587-349321aec2f1', 'source': 'rust_book.pdf', 'start_index': 0, 'hash': '1e6f21ec928a0ad5c77e773f3511434b9b95aa9cab72da398aaeaa7aa1500cfe', 'embedding_config': '{"engine": "openai", "model": "nomic"}', 'score': 0.14414885640144348}, {'Content-Type': 'application/pdf', 'name': 'rust_book.pdf', 'created_by': '37441ea6-7a20-40ed-8336-07ff74a79107', 'file_id': '4d9bebfa-4a58-4a12-9587-349321aec2f1', 'source': 'rust_book.pdf', 'start_index': 0, 'hash': '1e6f21ec928a0ad5c77e773f3511434b9b95aa9cab72da398aaeaa7aa1500cfe', 'embedding_config': '{"engine": "openai", "model": "nomic"}', 'score': 0.1127954050898552}, {'Content-Type': 'application/pdf', 'name': 'rust_book.pdf', 'created_by': '37441ea6-7a20-40ed-8336-07ff74a79107', 'file_id': '4d9bebfa-4a58-4a12-9587-349321aec2f1', 'source': 'rust_book.pdf', 'start_index': 0, 'hash': '1e6f21ec928a0ad5c77e773f3511434b9b95aa9cab72da398aaeaa7aa1500cfe', 'embedding_config': '{"engine": "openai", "model": 
"nomic"}', 'score': 0.10970578342676163}, {'Content-Type': 'application/pdf', 'name': 'rust_book.pdf', 'created_by': '37441ea6-7a20-40ed-8336-07ff74a79107', 'file_id': '4d9bebfa-4a58-4a12-9587-349321aec2f1', 'source': 'rust_book.pdf', 'start_index': 0, 'hash': '1e6f21ec928a0ad5c77e773f3511434b9b95aa9cab72da398aaeaa7aa1500cfe', 'embedding_config': '{"engine": "openai", "model": "nomic"}', 'score': 0.09670579433441162}]] [[0.3415825068950653, 0.2267836332321167, 0.14414885640144348, 0.1127954050898552, 0.10970578342676163, 0.09670579433441162]] - {}
...
Author
Owner

@Abdelrahman1993 commented on GitHub (Jul 21, 2025):

I'm having the same problem since v0.6.16. Just upgraded to v0.6.18 and the issue is still there.

Open WebUI logs:


2025-07-21 15:22:18.788 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/models HTTP/1.1" 200 - {"dd.trace_id": "687e5b250000000004826ff3a7b6bf2a", "dd.span_id": "1219164584246846588", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:23.910 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "POST /api/v1/chats/new HTTP/1.1" 200 - {"dd.trace_id": "687e5b2f000000001951367b158a451d", "dd.span_id": "9670324238172803639", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:24.045 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {"dd.trace_id": "687e5b30000000005778692fbd3e89e6", "dd.span_id": "5808409242413890064", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:24.185 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/folders/ HTTP/1.1" 200 - {"dd.trace_id": "687e5b3000000000054866ed6a1627a2", "dd.span_id": "4998135653675788540", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:24.226 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "POST /api/v1/chats/097c6707-cdb8-4daf-8601-dd545562e4c6 HTTP/1.1" 200 - {"dd.trace_id": "687e5b3000000000ee6a3d87e7aa6ce3", "dd.span_id": "2312236660076630617", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:24.344 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {"dd.trace_id": "687e5b3000000000e1c26cf326561b9b", "dd.span_id": "7668880088835838458", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:24.465 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/folders/ HTTP/1.1" 200 - {"dd.trace_id": "687e5b3000000000ac076c476a63760e", "dd.span_id": "1196482136707810870", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:25.833 | INFO     | httpx._client:_send_single_request:1025 - HTTP Request: GET http://beta-qdrant:6333/collections/open-webui_memories/exists "HTTP/1.1 200 OK" - {"dd.trace_id": "687e5b3000000000f204b40482f32bfd", "dd.span_id": "17268840605536355462", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:25.883 | INFO     | open_webui.routers.openai:get_all_models:392 - get_all_models() - {"dd.trace_id": "687e5b3000000000f204b40482f32bfd", "dd.span_id": "1754143943661600665", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:27.809 | INFO     | httpx._client:_send_single_request:1025 - HTTP Request: GET http://beta-qdrant:6333/collections/open-webui_knowledge/exists "HTTP/1.1 200 OK" - {"dd.trace_id": "687e5b3000000000f204b40482f32bfd", "dd.span_id": "14078348081553326992", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:27.819 | INFO     | httpx._client:_send_single_request:1025 - HTTP Request: GET http://beta-qdrant:6333/collections/open-webui_knowledge/exists "HTTP/1.1 200 OK" - {"dd.trace_id": "687e5b3000000000f204b40482f32bfd", "dd.span_id": "9021380118263690380", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:27.833 | INFO     | httpx._client:_send_single_request:1025 - HTTP Request: GET http://beta-qdrant:6333/collections/open-webui_knowledge/exists "HTTP/1.1 200 OK" - {"dd.trace_id": "687e5b3000000000f204b40482f32bfd", "dd.span_id": "17035142705858211415", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:27.884 | INFO     | open_webui.routers.openai:get_all_models:392 - get_all_models() - {"dd.trace_id": "687e5b3000000000f204b40482f32bfd", "dd.span_id": "1754143943661600665", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:29.286 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/chats/097c6707-cdb8-4daf-8601-dd545562e4c6 HTTP/1.1" 200 - {"dd.trace_id": "687e5b350000000054d83000020ad598", "dd.span_id": "4568960867969541899", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:29.411 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "POST /api/chat/completions HTTP/1.1" 200 - {"dd.trace_id": "687e5b3000000000f204b40482f32bfd", "dd.span_id": "1754143943661600665", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:29.574 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {"dd.trace_id": "687e5b35000000004fd43ffddd212c36", "dd.span_id": "922351395071495831", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:29.680 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/folders/ HTTP/1.1" 200 - {"dd.trace_id": "687e5b35000000008d9dc35c792b97e3", "dd.span_id": "8579733363339881751", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:38.785 | INFO     | open_webui.routers.openai:get_all_models:392 - get_all_models() - {"dd.trace_id": "0", "dd.span_id": "0", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:39.434 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "POST /api/chat/completed HTTP/1.1" 200 - {"dd.trace_id": "687e5b3e0000000016eaac162213781d", "dd.span_id": "4285429340995293785", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:39.975 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "POST /api/v1/chats/097c6707-cdb8-4daf-8601-dd545562e4c6 HTTP/1.1" 200 - {"dd.trace_id": "687e5b3f0000000084114ba386f6ae12", "dd.span_id": "495931353932802937", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:40.113 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {"dd.trace_id": "687e5b400000000038433d2b2d191001", "dd.span_id": "5691487724829672670", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:40.238 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/folders/ HTTP/1.1" 200 - {"dd.trace_id": "687e5b400000000018a91e2b937a0a06", "dd.span_id": "9334658167905857777", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:41.024 | INFO     | open_webui.routers.openai:get_all_models:392 - get_all_models() - {"dd.trace_id": "0", "dd.span_id": "0", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:41.114 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 - {"dd.trace_id": "687e5b4100000000b2179585cc8ed54c", "dd.span_id": "1790086560446877485", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:41.235 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/folders/ HTTP/1.1" 200 - {"dd.trace_id": "687e5b4100000000c127096435699106", "dd.span_id": "5077320422372030479", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:43.273 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/chats/097c6707-cdb8-4daf-8601-dd545562e4c6 HTTP/1.1" 200 - {"dd.trace_id": "687e5b4300000000b66884a1a806b457", "dd.span_id": "1986476246681890740", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:43.385 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /api/v1/chats/all/tags HTTP/1.1" 200 - {"dd.trace_id": "687e5b4300000000f5a8965d8e190e0c", "dd.span_id": "1631592220646372169", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}
2025-07-21 15:22:46.450 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 10.6.12.13:0 - "GET /_app/version.json HTTP/1.1" 200 - {"dd.trace_id": "687e5b4600000000fef97b954290df55", "dd.span_id": "4032254470531713533", "dd.service": "open-webui-deployment", "dd.version": "", "dd.env": "beta"}

Qdrant DB Logs

2025-07-21T15:22:25.832105Z  INFO actix_web::middleware::logger: 10.200.76.33 "GET /collections/open-webui_memories/exists HTTP/1.1" 200 57 "-" "qdrant-client/1.14.3 python/3.11.13" 0.000324    
2025-07-21T15:22:27.807063Z  INFO actix_web::middleware::logger: 10.200.76.33 "GET /collections/open-webui_knowledge/exists HTTP/1.1" 200 57 "-" "qdrant-client/1.14.3 python/3.11.13" 0.000663    
2025-07-21T15:22:27.815008Z  INFO actix_web::middleware::logger: 10.200.76.33 "GET /collections/open-webui_knowledge/exists HTTP/1.1" 200 58 "-" "qdrant-client/1.14.3 python/3.11.13" 0.000676    
2025-07-21T15:22:27.829070Z  INFO actix_web::middleware::logger: 10.200.76.33 "GET /collections/open-webui_knowledge/exists HTTP/1.1" 200 57 "-" "qdrant-client/1.14.3 python/3.11.13" 0.000315  

@tjbck commented on GitHub (Jul 21, 2025):

@dotmobo are you using qdrant as well?

@dotmobo commented on GitHub (Jul 21, 2025):

Yes, I use Qdrant.

@cyberclaw03 commented on GitHub (Jul 23, 2025):

Since upgrading to v0.6.18 from v0.6.14, I am having this very issue. I am also using qdrant. Prior to the upgrade, RAG worked perfectly.

@cyberclaw03 commented on GitHub (Jul 23, 2025):

I managed to fix it on my end by deleting and recreating my Knowledge Bases. I also set the "ENABLE_QDRANT_MULTITENANCY_MODE" environment variable to "true," which may have helped.

At that point, though, I ran into a totally different issue: Tika was no longer working because of a recent Docker image update. I rolled back to an older version of Tika and now everything works great.

@MacJedi42 commented on GitHub (Jul 24, 2025):

Have the same issue since upgrading to 0.6.18.

@dotmobo commented on GitHub (Jul 24, 2025):

Thanks for the tips @cyberclaw03.
I clicked the "Reindex all existing documents" button on the administrator page and it works again with 0.6.18, @tjbck.

@alexdjachenko commented on GitHub (Jul 24, 2025):

I have the same issue: a bare KB works well, but in hybrid mode no documents are passed to the LLM at all (neither KB nor keywords). 0.6.18.
P.S. I use the internal Sentence Transformer.

@MarceloCogo commented on GitHub (Jul 24, 2025):

I have the same issue using Hybrid Search. 0.6.18

@tjbck commented on GitHub (Jul 24, 2025):

Most likely related to: https://github.com/open-webui/open-webui/pull/15289

@Anush008 could you chime in?

@Anush008 commented on GitHub (Jul 24, 2025):

> I clicked on the "Reindex all existing documents" button in the administrator page and it works again with the 0.6.18

Hey all. Does this resolve the issue for you?

@alexdjachenko commented on GitHub (Jul 24, 2025):

> > I clicked on the "Reindex all existing documents" button in the administrator page and it works again with the 0.6.18
>
> Hey all. Does this resolve the issue for you?

Nope, I use the internal Sentence Transformer, not Qdrant.
The KB works without reindexing. It seems the trouble is in mixing the results.

2025-07-24 16:57:21.398 | DEBUG | open_webui.models.tags:delete_tag_by_name_and_user_id:101 - res: 1 - {}
2025-07-24 16:57:21.447 | INFO | open_webui.models.chats:count_chats_by_tag_name_and_user_id:871 - Count of chats for tag 'user_interface_customization': 0 - {}
2025-07-24 16:57:21.449 | DEBUG | open_webui.models.tags:delete_tag_by_name_and_user_id:101 - res: 1 - {}
2025-07-24 16:57:21.496 | INFO | open_webui.models.chats:count_chats_by_tag_name_and_user_id:871 - Count of chats for tag 'technology': 15 - {}
2025-07-24 16:57:21.539 | INFO | open_webui.models.chats:count_chats_by_tag_name_and_user_id:871 - Count of chats for tag 'assistive_technologies': 0 - {}
2025-07-24 16:57:21.540 | DEBUG | open_webui.models.tags:delete_tag_by_name_and_user_id:101 - res: 1 - {}
2025-07-24 16:57:21.587 | INFO | open_webui.models.chats:count_chats_by_tag_name_and_user_id:871 - Count of chats for tag 'web_accessibility': 1 - {}
2025-07-24 16:57:21.629 | INFO | open_webui.models.chats:count_chats_by_tag_name_and_user_id:871 - Count of chats for tag 'education': 8 - {}

@sendmebits commented on GitHub (Jul 24, 2025):

I'm seeing this as well. Here are the specifics:

  • It's working on multiple instances on v0.6.15 and breaks as of v0.6.16
  • v0.6.18 does not resolve (I've rolled everything back to v0.6.15 to have it working)
  • It is NOT related to Qdrant; I'm not using it
  • I am using Tika, unsure if this is a factor
  • The file does get uploaded and processed by Tika just fine, you can click on it and view it prior to submitting the question. Using the whole doc also works just fine - it's specifically the RAG/focused retrieval that fails
  • I am using 'OpenAI' for embedding model engine and 'External' for Reranking engine. Both are pointing at LiteLLM.

Below is the log line I see when it happens:

rid_search:352 - Starting hybrid search for 2 queries in 1 collections... - {}
open-webui  | 2025-07-24 17:59:07.525 | ERROR    | open_webui.retrieval.utils:query_doc_with_hybrid_search:191 - Error querying doc file-8279ba57-8316-4703-bec9-eed382b62736 with hybrid search: division by zero - {}
open-webui  | Traceback (most recent call last):
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui  |     self._bootstrap_inner()
open-webui  |     │    └ <function Thread._bootstrap_inner at 0x76137683c9a0>
open-webui  |     └ <Thread(ThreadPoolExecutor-6_0, started 129823959668416)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui  |     self.run()
open-webui  |     │    └ <function Thread.run at 0x76137683c680>
open-webui  |     └ <Thread(ThreadPoolExecutor-6_0, started 129823959668416)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui  |     self._target(*self._args, **self._kwargs)
open-webui  |     │    │        │    │        │    └ {}
open-webui  |     │    │        │    │        └ <Thread(ThreadPoolExecutor-6_0, started 129823959668416)>
open-webui  |     │    │        │    └ (<weakref at 0x7613379466b0; to 'ThreadPoolExecutor' at 0x761338d40250>, <_queue.SimpleQueue object at 0x7613379645e0>, None,...
open-webui  |     │    │        └ <Thread(ThreadPoolExecutor-6_0, started 129823959668416)>
open-webui  |     │    └ <function _worker at 0x7613758feb60>
open-webui  |     └ <Thread(ThreadPoolExecutor-6_0, started 129823959668416)>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui  |     work_item.run()
open-webui  |     │         └ <function _WorkItem.run at 0x7613758feca0>
open-webui  |     └ <concurrent.futures.thread._WorkItem object at 0x761338d74e50>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui  |     result = self.fn(*self.args, **self.kwargs)
open-webui  |              │    │   │    │       │    └ {}
open-webui  |              │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x761338d74e50>
open-webui  |              │    │   │    └ ('file-8279ba57-8316-4703-bec9-eed382b62736', 'how to view contents of a file')
open-webui  |              │    │   └ <concurrent.futures.thread._WorkItem object at 0x761338d74e50>
open-webui  |              │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x761337db0f40>
open-webui  |              └ <concurrent.futures.thread._WorkItem object at 0x761338d74e50>
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query
open-webui  |     result = query_doc_with_hybrid_search(
open-webui  |              └ <function query_doc_with_hybrid_search at 0x76133d91c220>
open-webui  | 
open-webui  | > File "/app/backend/open_webui/retrieval/utils.py", line 128, in query_doc_with_hybrid_search
open-webui  |     bm25_retriever = BM25Retriever.from_texts(
open-webui  |                      │             └ <classmethod(<function BM25Retriever.from_texts at 0x76133dadf420>)>
open-webui  |                      └ <class 'langchain_community.retrievers.bm25.BM25Retriever'>
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain_community/retrievers/bm25.py", line 64, in from_texts
open-webui  |     vectorizer = BM25Okapi(texts_processed, **bm25_params)
open-webui  |                  │         │                  └ {}
open-webui  |                  │         └ []
open-webui  |                  └ <class 'rank_bm25.BM25Okapi'>
open-webui  |   File "/usr/local/lib/python3.11/site-packages/rank_bm25.py", line 83, in __init__
open-webui  |     super().__init__(corpus, tokenizer)
open-webui  |                      │       └ None
open-webui  |                      └ []
open-webui  |   File "/usr/local/lib/python3.11/site-packages/rank_bm25.py", line 27, in __init__
open-webui  |     nd = self._initialize(corpus)
open-webui  |          │    │           └ []
open-webui  |          │    └ <function BM25._initialize at 0x7613389a7380>
open-webui  |          └ <rank_bm25.BM25Okapi object at 0x7613391a7310>
open-webui  |   File "/usr/local/lib/python3.11/site-packages/rank_bm25.py", line 52, in _initialize
open-webui  |     self.avgdl = num_doc / self.corpus_size
open-webui  |     │    │       │         │    └ 0
open-webui  |     │    │       │         └ <rank_bm25.BM25Okapi object at 0x7613391a7310>
open-webui  |     │    │       └ 0
open-webui  |     │    └ 0
open-webui  |     └ <rank_bm25.BM25Okapi object at 0x7613391a7310>
open-webui  | 
open-webui  | ZeroDivisionError: division by zero
open-webui  | 2025-07-24 17:59:07.527 | ERROR    | open_webui.retrieval.utils:process_query:371 - Error when querying the collection with hybrid_search: division by zero - {}
open-webui  | Traceback (most recent call last):
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui  |     self._bootstrap_inner()
open-webui  |     │    └ <function Thread._bootstrap_inner at 0x76137683c9a0>
open-webui  |     └ <Thread(ThreadPoolExecutor-6_0, started 129823959668416)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui  |     self.run()
open-webui  |     │    └ <function Thread.run at 0x76137683c680>
open-webui  |     └ <Thread(ThreadPoolExecutor-6_0, started 129823959668416)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui  |     self._target(*self._args, **self._kwargs)
open-webui  |     │    │        │    │        │    └ {}
open-webui  |     │    │        │    │        └ <Thread(ThreadPoolExecutor-6_0, started 129823959668416)>
open-webui  |     │    │        │    └ (<weakref at 0x7613379466b0; to 'ThreadPoolExecutor' at 0x761338d40250>, <_queue.SimpleQueue object at 0x7613379645e0>, None,...
open-webui  |     │    │        └ <Thread(ThreadPoolExecutor-6_0, started 129823959668416)>
open-webui  |     │    └ <function _worker at 0x7613758feb60>
open-webui  |     └ <Thread(ThreadPoolExecutor-6_0, started 129823959668416)>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui  |     work_item.run()
open-webui  |     │         └ <function _WorkItem.run at 0x7613758feca0>
open-webui  |     └ <concurrent.futures.thread._WorkItem object at 0x761338d74e50>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui  |     result = self.fn(*self.args, **self.kwargs)
open-webui  |              │    │   │    │       │    └ {}
open-webui  |              │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x761338d74e50>
open-webui  |              │    │   │    └ ('file-8279ba57-8316-4703-bec9-eed382b62736', 'how to view contents of a file')
open-webui  |              │    │   └ <concurrent.futures.thread._WorkItem object at 0x761338d74e50>
open-webui  |              │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x761337db0f40>
open-webui  |              └ <concurrent.futures.thread._WorkItem object at 0x761338d74e50>
open-webui  | 
open-webui  | > File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query
open-webui  |     result = query_doc_with_hybrid_search(
open-webui  |              └ <function query_doc_with_hybrid_search at 0x76133d91c220>
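For anyone reading the traceback: the ZeroDivisionError comes from rank_bm25's BM25 initializer, which computes `avgdl = num_doc / corpus_size`. `BM25Retriever.from_texts` is being handed an empty list of document texts (the collection query returned no chunks), so `corpus_size` is 0. A minimal stdlib-only sketch of the failure mode and a hypothetical guard (function names are illustrative, not Open WebUI's actual code):

```python
def average_doc_length(corpus: list[list[str]]) -> float:
    # Mimics rank_bm25.BM25._initialize: self.avgdl = num_doc / self.corpus_size
    num_doc = sum(len(doc) for doc in corpus)
    return num_doc / len(corpus)  # ZeroDivisionError when corpus == []


def bm25_leg_or_none(texts: list[str]):
    # Hypothetical guard: skip the BM25 leg entirely when retrieval
    # returned no chunks, instead of crashing the whole hybrid query.
    if not texts:
        return None  # caller can fall back to vector-only results
    return average_doc_length([t.split() for t in texts])
```

With a guard like this, an empty collection would degrade to vector-only search rather than failing the entire hybrid query with `division by zero`.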

@sendmebits commented on GitHub (Jul 24, 2025):

Looking at other posters, I think the common factor for impacted folks is OpenAI embeddings plus an External re-ranker. Does that hold true for others impacted by this issue?

Example from dotmobo:

```
2025-07-21 09:29:32.465 | DEBUG    | open_webui.retrieval.utils:generate_openai_batch_embeddings:667 - generate_openai_batch_embeddings:model nomic batch size: 1 - {}
....
2025-07-21 09:29:32.507 | INFO     | open_webui.retrieval.models.external:predict:36 - ExternalReranker:predict:model bge-reranker - {}
```

@tjbck commented on GitHub (Jul 24, 2025):

@sendmebits the logs seem to indicate `collection_results` did not return any values from the vector search, could you confirm?


@tjbck commented on GitHub (Jul 24, 2025):

Definitely cannot reproduce any of the issues mentioned here from our end; hybrid search works as intended in our testing setup. Could anyone provide more detailed information on the exact configuration you're using, as well as the file being uploaded?


@sendmebits commented on GitHub (Jul 24, 2025):

It's any file; to simplify testing I've just been using a simple TEST.txt file with a paragraph of plain text in it.

For configuration: everything is on a single Docker host, on the same Docker network, not a stack, separate containers.

It looks like it's getting a 404: '404 Client Error: Not Found for url: http://litellm:4000/v1'

```
ThreadPoolExecutor-6_0, started 129823959668416...
open-webui  | 
open-webui  | > File "/app/backend/open_webui/retrieval/models/external.py", line 62, in predict
open-webui  |     r.raise_for_status()
open-webui  |     │ └ <function Response.raise_for_status at 0x75ae34a2c860>
open-webui  |     └ <Response [404]>
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1026, in raise_for_status
open-webui  |     raise HTTPError(http_error_msg, response=self)
open-webui  |           │         │                        └ <Response [404]>
open-webui  |           │         └ '404 Client Error: Not Found for url: http://litellm:4000/v1'
open-webui  |           └ <class 'requests.exceptions.HTTPError'>
open-webui  | 
open-webui  | requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http://litellm:4000/v1
open-webui  | 2025-07-24 21:56:29.534 | ERROR    | open_webui.retrieval.utils:query_doc_with_hybrid_search:191 - Error querying doc file-2acf3182-d847-44b6-ad15-76c7220f2609 with hybrid search: 'NoneType' object has no attribute 'tolist' - {}
```

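The `'NoneType' object has no attribute 'tolist'` at the end of that log looks like a follow-on failure: the external reranker swallowed the 404 and returned `None` for the scores, which the compressor then tried to call `.tolist()` on. A defensive sketch of that step (hypothetical names, not the project's actual `compress_documents`):

```python
def compress_with_scores(documents, scores):
    # If the external reranker failed and returned None, fall back to the
    # original document order instead of crashing on scores.tolist().
    if scores is None:
        return documents
    score_list = scores if isinstance(scores, list) else scores.tolist()
    ranked = sorted(zip(documents, score_list), key=lambda p: p[1], reverse=True)
    return [doc for doc, _ in ranked]
```

Whether falling back silently is the right behavior is debatable; surfacing the reranker error to the user would at least have made this 404 obvious sooner.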
This config works on v0.6.15 but not on v0.6.16:

![Configuration screenshot](https://github.com/user-attachments/assets/9ab459d7-15dc-4ac4-a388-80a966ee1332)

@tjbck commented on GitHub (Jul 24, 2025):

@sendmebits could you confirm http://litellm:4000/v1 is reachable and that both `cohere.embed-english-v3` and `amazon.rerank-v1:0` are present and accessible via the API?


@sendmebits commented on GitHub (Jul 24, 2025):

If I change the Reranking Engine URL to 'http://litellm:4000/v1/rerank', it seems to be able to hybrid search the text documents now!

Note: This works just fine as http://litellm:4000/v1 on v0.6.15

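That matches a Cohere-style rerank API, where the request must be POSTed to the full `/rerank` path; hitting the bare `/v1` base returns the 404 seen in the logs above. A minimal sketch of such a client (hypothetical helper, assuming a LiteLLM proxy exposing `POST /v1/rerank`):

```python
def rerank_url(base_url: str) -> str:
    # Normalize the configured base URL to the full rerank endpoint:
    # "http://litellm:4000/v1" -> "http://litellm:4000/v1/rerank"
    return base_url.rstrip("/") + "/rerank"

def rerank(base_url, model, query, documents, api_key):
    # Imported lazily so rerank_url stays usable without requests installed.
    import requests

    r = requests.post(
        rerank_url(base_url),
        headers={"Authorization": f"Bearer {api_key}"},
        json={"model": model, "query": query, "documents": documents},
        timeout=30,
    )
    r.raise_for_status()  # a bare /v1 base URL would raise the 404 here
    return [item["relevance_score"] for item in r.json()["results"]]
```

For example, `rerank_url("http://litellm:4000/v1")` yields the working URL from the comment above. Whether v0.6.15 appended the path automatically is not confirmed here; that behavioral difference between versions is what the maintainers would need to pin down.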

@tjbck commented on GitHub (Jul 24, 2025):

@sendmebits strange, only `http://litellm:4000/v1/rerank` should've been supported if I'm not mistaken. Could you share your logs for 0.6.15?


@sendmebits commented on GitHub (Jul 24, 2025):

Here are the logs for 0.6.15. I do still see the 404, but it works for some reason. There are some other errors too; not sure what they mean.

open-webui  | AttributeError: 'NoneType' object has no attribute 'tolist'
open-webui  | 2025-07-24 22:12:36.550 | ERROR    | open_webui.retrieval.utils:process_query:367 - Error when querying the collection with hybrid_search: 'NoneType' object has no attribute 'tolist' - {}
open-webui  | Traceback (most recent call last):
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui  |     self._bootstrap_inner()
open-webui  |     │    └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_0, started 123296901232320)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui  |     self.run()
open-webui  |     │    └ <function Thread.run at 0x7023d8504680>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_0, started 123296901232320)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui  |     self._target(*self._args, **self._kwargs)
open-webui  |     │    │        │    │        │    └ {}
open-webui  |     │    │        │    │        └ <Thread(ThreadPoolExecutor-17_0, started 123296901232320)>
open-webui  |     │    │        │    └ (<weakref at 0x7023526a87c0; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui  |     │    │        └ <Thread(ThreadPoolExecutor-17_0, started 123296901232320)>
open-webui  |     │    └ <function _worker at 0x7023d741fd80>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_0, started 123296901232320)>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui  |     work_item.run()
open-webui  |     │         └ <function _WorkItem.run at 0x7023d741fec0>
open-webui  |     └ <concurrent.futures.thread._WorkItem object at 0x7023524bbf90>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui  |     result = self.fn(*self.args, **self.kwargs)
open-webui  |              │    │   │    │       │    └ {}
open-webui  |              │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x7023524bbf90>
open-webui  |              │    │   │    └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'contents of the file')
open-webui  |              │    │   └ <concurrent.futures.thread._WorkItem object at 0x7023524bbf90>
open-webui  |              │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui  |              └ <concurrent.futures.thread._WorkItem object at 0x7023524bbf90>
open-webui  | 
open-webui  | > File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui  |     result = query_doc_with_hybrid_search(
open-webui  |              └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 188, in query_doc_with_hybrid_search
open-webui  |     raise e
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui  |     result = compression_retriever.invoke(query)
open-webui  |              │                     │      └ 'contents of the file'
open-webui  |              │                     └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui  |     result = self._get_relevant_documents(
open-webui  |              │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui  |     compressed_docs = self.base_compressor.compress_documents(
open-webui  |                       │    │               └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui  |                       │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui  |                       └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 890, in compress_documents
open-webui  |     zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
open-webui  |         │          │                                 │                  └ None
open-webui  |         │          │                                 └ None
open-webui  |         │          └ None
open-webui  |         └ [Document(metadata={'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": ...
open-webui  | 
open-webui  | AttributeError: 'NoneType' object has no attribute 'tolist'
open-webui  | 2025-07-24 22:12:36.556 | INFO     | open_webui.retrieval.models.external:predict:36 - ExternalReranker:predict:model amazon.rerank-v1:0 - {}
open-webui  | 2025-07-24 22:12:36.556 | INFO     | open_webui.retrieval.models.external:predict:37 - ExternalReranker:predict:query methods to inspect file contents - {}
open-webui  | 2025-07-24 22:12:36.558 | INFO     | open_webui.retrieval.models.external:predict:36 - ExternalReranker:predict:model amazon.rerank-v1:0 - {}
open-webui  | 2025-07-24 22:12:36.559 | INFO     | open_webui.retrieval.models.external:predict:37 - ExternalReranker:predict:query how to view file contents - {}
open-webui  | 2025-07-24 22:12:36.561 | ERROR    | open_webui.retrieval.models.external:predict:59 - Error in external reranking: 404 Client Error: Not Found for url: http://litellm:4000/v1 - {}
open-webui  | Traceback (most recent call last):
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui  |     self._bootstrap_inner()
open-webui  |     │    └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui  |     self.run()
open-webui  |     │    └ <function Thread.run at 0x7023d8504680>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui  |     self._target(*self._args, **self._kwargs)
open-webui  |     │    │        │    │        │    └ {}
open-webui  |     │    │        │    │        └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |     │    │        │    └ (<weakref at 0x70237819f330; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui  |     │    │        └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |     │    └ <function _worker at 0x7023d741fd80>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui  |     work_item.run()
open-webui  |     │         └ <function _WorkItem.run at 0x7023d741fec0>
open-webui  |     └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui  |     result = self.fn(*self.args, **self.kwargs)
open-webui  |              │    │   │    │       │    └ {}
open-webui  |              │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  |              │    │   │    └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'methods to inspect file contents')
open-webui  |              │    │   └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  |              │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui  |              └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui  |     result = query_doc_with_hybrid_search(
open-webui  |              └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui  |     result = compression_retriever.invoke(query)
open-webui  |              │                     │      └ 'methods to inspect file contents'
open-webui  |              │                     └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui  |     result = self._get_relevant_documents(
open-webui  |              │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui  |     compressed_docs = self.base_compressor.compress_documents(
open-webui  |                       │    │               └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui  |                       │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui  |                       └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 877, in compress_documents
open-webui  |     scores = self.reranking_function.predict(
open-webui  |              │    │                  └ <function ExternalReranker.predict at 0x7023811645e0>
open-webui  |              │    └ <open_webui.retrieval.models.external.ExternalReranker object at 0x702351e9da90>
open-webui  |              └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui  | 
open-webui  | > File "/app/backend/open_webui/retrieval/models/external.py", line 48, in predict
open-webui  |     r.raise_for_status()
open-webui  |     │ └ <function Response.raise_for_status at 0x7023d4e12b60>
open-webui  |     └ <Response [404]>
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1026, in raise_for_status
open-webui  |     raise HTTPError(http_error_msg, response=self)
open-webui  |           │         │                        └ <Response [404]>
open-webui  |           │         └ '404 Client Error: Not Found for url: http://litellm:4000/v1'
open-webui  |           └ <class 'requests.exceptions.HTTPError'>
open-webui  | 
open-webui  | requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http://litellm:4000/v1
open-webui  | 2025-07-24 22:12:36.565 | ERROR    | open_webui.retrieval.models.external:predict:59 - Error in external reranking: 404 Client Error: Not Found for url: http://litellm:4000/v1 - {}
open-webui  | Traceback (most recent call last):
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui  |     self._bootstrap_inner()
open-webui  |     │    └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui  |     self.run()
open-webui  |     │    └ <function Thread.run at 0x7023d8504680>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui  |     self._target(*self._args, **self._kwargs)
open-webui  |     │    │        │    │        │    └ {}
open-webui  |     │    │        │    │        └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |     │    │        │    └ (<weakref at 0x7023526ab3d0; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui  |     │    │        └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |     │    └ <function _worker at 0x7023d741fd80>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui  |     work_item.run()
open-webui  |     │         └ <function _WorkItem.run at 0x7023d741fec0>
open-webui  |     └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui  |     result = self.fn(*self.args, **self.kwargs)
open-webui  |              │    │   │    │       │    └ {}
open-webui  |              │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  |              │    │   │    └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'how to view file contents')
open-webui  |              │    │   └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  |              │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui  |              └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui  |     result = query_doc_with_hybrid_search(
open-webui  |              └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui  |     result = compression_retriever.invoke(query)
open-webui  |              │                     │      └ 'how to view file contents'
open-webui  |              │                     └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui  |     result = self._get_relevant_documents(
open-webui  |              │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui  |     compressed_docs = self.base_compressor.compress_documents(
open-webui  |                       │    │               └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui  |                       │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui  |                       └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 877, in compress_documents
open-webui  |     scores = self.reranking_function.predict(
open-webui  |              │    │                  └ <function ExternalReranker.predict at 0x7023811645e0>
open-webui  |              │    └ <open_webui.retrieval.models.external.ExternalReranker object at 0x702351e9da90>
open-webui  |              └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui  | 
open-webui  | > File "/app/backend/open_webui/retrieval/models/external.py", line 48, in predict
open-webui  |     r.raise_for_status()
open-webui  |     │ └ <function Response.raise_for_status at 0x7023d4e12b60>
open-webui  |     └ <Response [404]>
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1026, in raise_for_status
open-webui  |     raise HTTPError(http_error_msg, response=self)
open-webui  |           │         │                        └ <Response [404]>
open-webui  |           │         └ '404 Client Error: Not Found for url: http://litellm:4000/v1'
open-webui  |           └ <class 'requests.exceptions.HTTPError'>
open-webui  | 
open-webui  | requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http://litellm:4000/v1
open-webui  | 2025-07-24 22:12:36.567 | ERROR    | open_webui.retrieval.utils:query_doc_with_hybrid_search:187 - Error querying doc file-09bc8315-e798-4658-b047-4a75a4527650 with hybrid search: 'NoneType' object has no attribute 'tolist' - {}
open-webui  | Traceback (most recent call last):
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui  |     self._bootstrap_inner()
open-webui  |     │    └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui  |     self.run()
open-webui  |     │    └ <function Thread.run at 0x7023d8504680>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui  |     self._target(*self._args, **self._kwargs)
open-webui  |     │    │        │    │        │    └ {}
open-webui  |     │    │        │    │        └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |     │    │        │    └ (<weakref at 0x70237819f330; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui  |     │    │        └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |     │    └ <function _worker at 0x7023d741fd80>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui  |     work_item.run()
open-webui  |     │         └ <function _WorkItem.run at 0x7023d741fec0>
open-webui  |     └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui  |     result = self.fn(*self.args, **self.kwargs)
open-webui  |              │    │   │    │       │    └ {}
open-webui  |              │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  |              │    │   │    └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'methods to inspect file contents')
open-webui  |              │    │   └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  |              │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui  |              └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui  |     result = query_doc_with_hybrid_search(
open-webui  |              └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui  | 
open-webui  | > File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui  |     result = compression_retriever.invoke(query)
open-webui  |              │                     │      └ 'methods to inspect file contents'
open-webui  |              │                     └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui  |     result = self._get_relevant_documents(
open-webui  |              │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui  |     compressed_docs = self.base_compressor.compress_documents(
open-webui  |                       │    │               └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui  |                       │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui  |                       └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 890, in compress_documents
open-webui  |     zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
open-webui  |         │          │                                 │                  └ None
open-webui  |         │          │                                 └ None
open-webui  |         │          └ None
open-webui  |         └ [Document(metadata={'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": ...
open-webui  | 
open-webui  | AttributeError: 'NoneType' object has no attribute 'tolist'
open-webui  | 2025-07-24 22:12:36.569 | ERROR    | open_webui.retrieval.utils:query_doc_with_hybrid_search:187 - Error querying doc file-09bc8315-e798-4658-b047-4a75a4527650 with hybrid search: 'NoneType' object has no attribute 'tolist' - {}
open-webui  | Traceback (most recent call last):
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui  |     self._bootstrap_inner()
open-webui  |     │    └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui  |     self.run()
open-webui  |     │    └ <function Thread.run at 0x7023d8504680>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui  |     self._target(*self._args, **self._kwargs)
open-webui  |     │    │        │    │        │    └ {}
open-webui  |     │    │        │    │        └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |     │    │        │    └ (<weakref at 0x7023526ab3d0; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui  |     │    │        └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |     │    └ <function _worker at 0x7023d741fd80>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui  |     work_item.run()
open-webui  |     │         └ <function _WorkItem.run at 0x7023d741fec0>
open-webui  |     └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui  |     result = self.fn(*self.args, **self.kwargs)
open-webui  |              │    │   │    │       │    └ {}
open-webui  |              │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  |              │    │   │    └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'how to view file contents')
open-webui  |              │    │   └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  |              │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui  |              └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui  |     result = query_doc_with_hybrid_search(
open-webui  |              └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui  | 
open-webui  | > File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui  |     result = compression_retriever.invoke(query)
open-webui  |              │                     │      └ 'how to view file contents'
open-webui  |              │                     └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui  |     result = self._get_relevant_documents(
open-webui  |              │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui  |     compressed_docs = self.base_compressor.compress_documents(
open-webui  |                       │    │               └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui  |                       │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui  |                       └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 890, in compress_documents
open-webui  |     zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
open-webui  |         │          │                                 │                  └ None
open-webui  |         │          │                                 └ None
open-webui  |         │          └ None
open-webui  |         └ [Document(metadata={'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": ...
open-webui  | 
open-webui  | AttributeError: 'NoneType' object has no attribute 'tolist'
open-webui  | 2025-07-24 22:12:36.571 | ERROR    | open_webui.retrieval.utils:process_query:367 - Error when querying the collection with hybrid_search: 'NoneType' object has no attribute 'tolist' - {}
open-webui  | Traceback (most recent call last):
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui  |     self._bootstrap_inner()
open-webui  |     │    └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui  |     self.run()
open-webui  |     │    └ <function Thread.run at 0x7023d8504680>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui  |     self._target(*self._args, **self._kwargs)
open-webui  |     │    │        │    │        │    └ {}
open-webui  |     │    │        │    │        └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |     │    │        │    └ (<weakref at 0x70237819f330; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui  |     │    │        └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |     │    └ <function _worker at 0x7023d741fd80>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui  |     work_item.run()
open-webui  |     │         └ <function _WorkItem.run at 0x7023d741fec0>
open-webui  |     └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui  |     result = self.fn(*self.args, **self.kwargs)
open-webui  |              │    │   │    │       │    └ {}
open-webui  |              │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  |              │    │   │    └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'methods to inspect file contents')
open-webui  |              │    │   └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  |              │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui  |              └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui  | 
open-webui  | > File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui  |     result = query_doc_with_hybrid_search(
open-webui  |              └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 188, in query_doc_with_hybrid_search
open-webui  |     raise e
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui  |     result = compression_retriever.invoke(query)
open-webui  |              │                     │      └ 'methods to inspect file contents'
open-webui  |              │                     └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui  |     result = self._get_relevant_documents(
open-webui  |              │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui  |     compressed_docs = self.base_compressor.compress_documents(
open-webui  |                       │    │               └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui  |                       │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui  |                       └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 890, in compress_documents
open-webui  |     zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
open-webui  |         │          │                                 │                  └ None
open-webui  |         │          │                                 └ None
open-webui  |         │          └ None
open-webui  |         └ [Document(metadata={'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": ...
open-webui  | 
open-webui  | AttributeError: 'NoneType' object has no attribute 'tolist'
open-webui  | 2025-07-24 22:12:36.573 | ERROR    | open_webui.retrieval.utils:process_query:367 - Error when querying the collection with hybrid_search: 'NoneType' object has no attribute 'tolist' - {}
open-webui  | Traceback (most recent call last):
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui  |     self._bootstrap_inner()
open-webui  |     │    └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui  |     self.run()
open-webui  |     │    └ <function Thread.run at 0x7023d8504680>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |   File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui  |     self._target(*self._args, **self._kwargs)
open-webui  |     │    │        │    │        │    └ {}
open-webui  |     │    │        │    │        └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |     │    │        │    └ (<weakref at 0x7023526ab3d0; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui  |     │    │        └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |     │    └ <function _worker at 0x7023d741fd80>
open-webui  |     └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui  |     work_item.run()
open-webui  |     │         └ <function _WorkItem.run at 0x7023d741fec0>
open-webui  |     └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  |   File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui  |     result = self.fn(*self.args, **self.kwargs)
open-webui  |              │    │   │    │       │    └ {}
open-webui  |              │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  |              │    │   │    └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'how to view file contents')
open-webui  |              │    │   └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  |              │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui  |              └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui  | 
open-webui  | > File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui  |     result = query_doc_with_hybrid_search(
open-webui  |              └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 188, in query_doc_with_hybrid_search
open-webui  |     raise e
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui  |     result = compression_retriever.invoke(query)
open-webui  |              │                     │      └ 'how to view file contents'
open-webui  |              │                     └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui  |     result = self._get_relevant_documents(
open-webui  |              │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui  |              └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  |   File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui  |     compressed_docs = self.base_compressor.compress_documents(
open-webui  |                       │    │               └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui  |                       │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui  |                       └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui  | 
open-webui  |   File "/app/backend/open_webui/retrieval/utils.py", line 890, in compress_documents
open-webui  |     zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
open-webui  |         │          │                                 │                  └ None
open-webui  |         │          │                                 └ None
open-webui  |         │          └ None
open-webui  |         └ [Document(metadata={'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": ...
open-webui  | 
open-webui  | AttributeError: 'NoneType' object has no attribute 'tolist'
open-webui  | 2025-07-24 22:12:36.786 | INFO     | open_webui.retrieval.utils:query_doc:89 - query_doc:result [['6b66303b-fdb7-4205-862c-d4a9df7756cf', 'f4953968-c276-4e8a-a467-07700dbbdc4c', 'd309bacc-50c3-493b-adb4-d00d391e020f', 'c399b97d-7e2e-420a-a3e8-1dc655e46b11', 'b5e4d04f-23b8-4ca6-bbff-9fb7e9e2fdf5', 'abce3182-747d-418a-8254-1dd1d1ecce81', '40f0bd30-b7cc-49d8-9deb-ec007259bddc', '05df30a7-9620-4876-879a-fd308b3b5b04', 'f8faa2dd-19dc-40c7-9248-6abc1047edc6', 'f7be2591-adb2-4fb6-968f-35298787a668']] [[{'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 6327}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 1866}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 15619}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 10953}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': 
'09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 3651}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 17434}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 7292}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 2732}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 11926}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 8189}]] - {}
open-webui  | 2025-07-24 22:12:36.787 | INFO     | open_webui.retrieval.utils:query_doc:89 - query_doc:result [['6b66303b-fdb7-4205-862c-d4a9df7756cf', 'c5ace266-ae90-469b-8045-b4050c30cd4c', 'f4953968-c276-4e8a-a467-07700dbbdc4c', 'd309bacc-50c3-493b-adb4-d00d391e020f', 'c399b97d-7e2e-420a-a3e8-1dc655e46b11', '05df30a7-9620-4876-879a-fd308b3b5b04', 'abce3182-747d-418a-8254-1dd1d1ecce81', 'b5e4d04f-23b8-4ca6-bbff-9fb7e9e2fdf5', '40f0bd30-b7cc-49d8-9deb-ec007259bddc', 'f6baf0ef-92a3-4cd4-9dec-eef2679ae4db']] [[{'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 6327}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 0}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 1866}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 15619}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': 
'09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 10953}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 2732}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 17434}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 3651}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 7292}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 12896}]] - {}
open-webui  | 2025-07-24 22:12:36.788 | INFO     | open_webui.retrieval.utils:query_doc:89 - query_doc:result [['6b66303b-fdb7-4205-862c-d4a9df7756cf', 'f4953968-c276-4e8a-a467-07700dbbdc4c', 'd309bacc-50c3-493b-adb4-d00d391e020f', 'c399b97d-7e2e-420a-a3e8-1dc655e46b11', 'c5ace266-ae90-469b-8045-b4050c30cd4c', 'b5e4d04f-23b8-4ca6-bbff-9fb7e9e2fdf5', '05df30a7-9620-4876-879a-fd308b3b5b04', '40f0bd30-b7cc-49d8-9deb-ec007259bddc', 'f8faa2dd-19dc-40c7-9248-6abc1047edc6', 'c0b9159f-cb19-4836-bcd7-088c95749b2e']] [[{'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 6327}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 1866}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 15619}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 10953}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': 
'09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 0}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 3651}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 2732}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 7292}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 11926}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 9086}]] - {}
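For what it's worth, every traceback in these logs dies at the same place: `compress_documents` in `open_webui/retrieval/utils.py` calls `scores.tolist()` while `scores` is `None`, which is what you get when the external reranker call fails (see the 404 against `http://litellm:4000/v1` in the logs). A minimal sketch of that failure mode and a possible guard; `normalize_scores` is a hypothetical helper for illustration, not actual Open WebUI code:

```python
# The failing line in compress_documents is:
#
#     zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
#
# with scores == None after the reranker request 404s. A defensive
# guard (hypothetical, not Open WebUI's fix) could coerce the reranker
# output to a plain list and fall back to neutral scores on failure.

def normalize_scores(scores, n_docs):
    if scores is None:
        # Reranker call failed: keep the documents with neutral scores
        # instead of raising AttributeError on None.tolist().
        return [0.0] * n_docs
    if isinstance(scores, list):
        return scores
    # numpy arrays (and torch tensors) expose .tolist()
    return scores.tolist()

print(normalize_scores(None, 3))        # [0.0, 0.0, 0.0]
print(normalize_scores([0.9, 0.2], 2))  # [0.9, 0.2]
```

With a guard like this, a reranker 404 would degrade to unreranked results instead of aborting the whole hybrid search query.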
@sendmebits commented on GitHub (Jul 24, 2025): Here are logs for 0.6.15. I do still see the 404, but it works for some reason... there are some other errors ... not sure what they mean though... ``` pen-webui | AttributeError: 'NoneType' object has no attribute 'tolist' open-webui | 2025-07-24 22:12:36.550 | ERROR | open_webui.retrieval.utils:process_query:367 - Error when querying the collection with hybrid_search: 'NoneType' object has no attribute 'tolist' - {} open-webui | Traceback (most recent call last): open-webui | open-webui | File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap open-webui | self._bootstrap_inner() open-webui | │ └ <function Thread._bootstrap_inner at 0x7023d85049a0> open-webui | └ <Thread(ThreadPoolExecutor-17_0, started 123296901232320)> open-webui | File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner open-webui | self.run() open-webui | │ └ <function Thread.run at 0x7023d8504680> open-webui | └ <Thread(ThreadPoolExecutor-17_0, started 123296901232320)> open-webui | File "/usr/local/lib/python3.11/threading.py", line 982, in run open-webui | self._target(*self._args, **self._kwargs) open-webui | │ │ │ │ │ └ {} open-webui | │ │ │ │ └ <Thread(ThreadPoolExecutor-17_0, started 123296901232320)> open-webui | │ │ │ └ (<weakref at 0x7023526a87c0; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,... 
open-webui | │ │ └ <Thread(ThreadPoolExecutor-17_0, started 123296901232320)> open-webui | │ └ <function _worker at 0x7023d741fd80> open-webui | └ <Thread(ThreadPoolExecutor-17_0, started 123296901232320)> open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker open-webui | work_item.run() open-webui | │ └ <function _WorkItem.run at 0x7023d741fec0> open-webui | └ <concurrent.futures.thread._WorkItem object at 0x7023524bbf90> open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run open-webui | result = self.fn(*self.args, **self.kwargs) open-webui | │ │ │ │ │ └ {} open-webui | │ │ │ │ └ <concurrent.futures.thread._WorkItem object at 0x7023524bbf90> open-webui | │ │ │ └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'contents of the file') open-webui | │ │ └ <concurrent.futures.thread._WorkItem object at 0x7023524bbf90> open-webui | │ └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560> open-webui | └ <concurrent.futures.thread._WorkItem object at 0x7023524bbf90> open-webui | open-webui | > File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query open-webui | result = query_doc_with_hybrid_search( open-webui | └ <function query_doc_with_hybrid_search at 0x702388f7e660> open-webui | open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 188, in query_doc_with_hybrid_search open-webui | raise e open-webui | open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search open-webui | result = compression_retriever.invoke(query) open-webui | │ │ └ 'contents of the file' open-webui | │ └ <function BaseRetriever.invoke at 0x7023894bfe20> open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l... 
open-webui |
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui | result = self._get_relevant_documents(
open-webui | │ └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui | compressed_docs = self.base_compressor.compress_documents(
open-webui | │ │ └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui | │ └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 890, in compress_documents
open-webui | zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
open-webui | │ │ │ └ None
open-webui | │ │ └ None
open-webui | │ └ None
open-webui | └ [Document(metadata={'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": ...
open-webui |
open-webui | AttributeError: 'NoneType' object has no attribute 'tolist'
open-webui | 2025-07-24 22:12:36.556 | INFO | open_webui.retrieval.models.external:predict:36 - ExternalReranker:predict:model amazon.rerank-v1:0 - {}
open-webui | 2025-07-24 22:12:36.556 | INFO | open_webui.retrieval.models.external:predict:37 - ExternalReranker:predict:query methods to inspect file contents - {}
open-webui | 2025-07-24 22:12:36.558 | INFO | open_webui.retrieval.models.external:predict:36 - ExternalReranker:predict:model amazon.rerank-v1:0 - {}
open-webui | 2025-07-24 22:12:36.559 | INFO | open_webui.retrieval.models.external:predict:37 - ExternalReranker:predict:query how to view file contents - {}
open-webui | 2025-07-24 22:12:36.561 | ERROR | open_webui.retrieval.models.external:predict:59 - Error in external reranking: 404 Client Error: Not Found for url: http://litellm:4000/v1 - {}
open-webui | Traceback (most recent call last):
open-webui |
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui | self._bootstrap_inner()
open-webui | │ └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui | └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui | self.run()
open-webui | │ └ <function Thread.run at 0x7023d8504680>
open-webui | └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui | self._target(*self._args, **self._kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | │ │ │ └ (<weakref at 0x70237819f330; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui | │ │ └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | │ └ <function _worker at 0x7023d741fd80>
open-webui | └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui | work_item.run()
open-webui | │ └ <function _WorkItem.run at 0x7023d741fec0>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui | result = self.fn(*self.args, **self.kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui | │ │ │ └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'methods to inspect file contents')
open-webui | │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui | │ └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui | result = query_doc_with_hybrid_search(
open-webui | └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui | result = compression_retriever.invoke(query)
open-webui | │ │ └ 'methods to inspect file contents'
open-webui | │ └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui | result = self._get_relevant_documents(
open-webui | │ └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui | compressed_docs = self.base_compressor.compress_documents(
open-webui | │ │ └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui | │ └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 877, in compress_documents
open-webui | scores = self.reranking_function.predict(
open-webui | │ │ └ <function ExternalReranker.predict at 0x7023811645e0>
open-webui | │ └ <open_webui.retrieval.models.external.ExternalReranker object at 0x702351e9da90>
open-webui | └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui |
open-webui | > File "/app/backend/open_webui/retrieval/models/external.py", line 48, in predict
open-webui | r.raise_for_status()
open-webui | │ └ <function Response.raise_for_status at 0x7023d4e12b60>
open-webui | └ <Response [404]>
open-webui |
open-webui | File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1026, in raise_for_status
open-webui | raise HTTPError(http_error_msg, response=self)
open-webui | │ │ └ <Response [404]>
open-webui | │ └ '404 Client Error: Not Found for url: http://litellm:4000/v1'
open-webui | └ <class 'requests.exceptions.HTTPError'>
open-webui |
open-webui | requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http://litellm:4000/v1
open-webui | 2025-07-24 22:12:36.565 | ERROR | open_webui.retrieval.models.external:predict:59 - Error in external reranking: 404 Client Error: Not Found for url: http://litellm:4000/v1 - {}
open-webui | Traceback (most recent call last):
open-webui |
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui | self._bootstrap_inner()
open-webui | │ └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui | └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui | self.run()
open-webui | │ └ <function Thread.run at 0x7023d8504680>
open-webui | └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui | self._target(*self._args, **self._kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | │ │ │ └ (<weakref at 0x7023526ab3d0; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui | │ │ └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | │ └ <function _worker at 0x7023d741fd80>
open-webui | └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui | work_item.run()
open-webui | │ └ <function _WorkItem.run at 0x7023d741fec0>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui | result = self.fn(*self.args, **self.kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui | │ │ │ └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'how to view file contents')
open-webui | │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui | │ └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui | result = query_doc_with_hybrid_search(
open-webui | └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui | result = compression_retriever.invoke(query)
open-webui | │ │ └ 'how to view file contents'
open-webui | │ └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui | result = self._get_relevant_documents(
open-webui | │ └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui | compressed_docs = self.base_compressor.compress_documents(
open-webui | │ │ └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui | │ └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 877, in compress_documents
open-webui | scores = self.reranking_function.predict(
open-webui | │ │ └ <function ExternalReranker.predict at 0x7023811645e0>
open-webui | │ └ <open_webui.retrieval.models.external.ExternalReranker object at 0x702351e9da90>
open-webui | └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui |
open-webui | > File "/app/backend/open_webui/retrieval/models/external.py", line 48, in predict
open-webui | r.raise_for_status()
open-webui | │ └ <function Response.raise_for_status at 0x7023d4e12b60>
open-webui | └ <Response [404]>
open-webui |
open-webui | File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1026, in raise_for_status
open-webui | raise HTTPError(http_error_msg, response=self)
open-webui | │ │ └ <Response [404]>
open-webui | │ └ '404 Client Error: Not Found for url: http://litellm:4000/v1'
open-webui | └ <class 'requests.exceptions.HTTPError'>
open-webui |
open-webui | requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http://litellm:4000/v1
open-webui | 2025-07-24 22:12:36.567 | ERROR | open_webui.retrieval.utils:query_doc_with_hybrid_search:187 - Error querying doc file-09bc8315-e798-4658-b047-4a75a4527650 with hybrid search: 'NoneType' object has no attribute 'tolist' - {}
open-webui | Traceback (most recent call last):
open-webui |
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui | self._bootstrap_inner()
open-webui | │ └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui | └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui | self.run()
open-webui | │ └ <function Thread.run at 0x7023d8504680>
open-webui | └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui | self._target(*self._args, **self._kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | │ │ │ └ (<weakref at 0x70237819f330; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui | │ │ └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | │ └ <function _worker at 0x7023d741fd80>
open-webui | └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui | work_item.run()
open-webui | │ └ <function _WorkItem.run at 0x7023d741fec0>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui | result = self.fn(*self.args, **self.kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui | │ │ │ └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'methods to inspect file contents')
open-webui | │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui | │ └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui | result = query_doc_with_hybrid_search(
open-webui | └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui |
open-webui | > File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui | result = compression_retriever.invoke(query)
open-webui | │ │ └ 'methods to inspect file contents'
open-webui | │ └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui | result = self._get_relevant_documents(
open-webui | │ └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui | compressed_docs = self.base_compressor.compress_documents(
open-webui | │ │ └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui | │ └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 890, in compress_documents
open-webui | zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
open-webui | │ │ │ └ None
open-webui | │ │ └ None
open-webui | │ └ None
open-webui | └ [Document(metadata={'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": ...
open-webui |
open-webui | AttributeError: 'NoneType' object has no attribute 'tolist'
open-webui | 2025-07-24 22:12:36.569 | ERROR | open_webui.retrieval.utils:query_doc_with_hybrid_search:187 - Error querying doc file-09bc8315-e798-4658-b047-4a75a4527650 with hybrid search: 'NoneType' object has no attribute 'tolist' - {}
open-webui | Traceback (most recent call last):
open-webui |
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui | self._bootstrap_inner()
open-webui | │ └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui | └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui | self.run()
open-webui | │ └ <function Thread.run at 0x7023d8504680>
open-webui | └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui | self._target(*self._args, **self._kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | │ │ │ └ (<weakref at 0x7023526ab3d0; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui | │ │ └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | │ └ <function _worker at 0x7023d741fd80>
open-webui | └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui | work_item.run()
open-webui | │ └ <function _WorkItem.run at 0x7023d741fec0>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui | result = self.fn(*self.args, **self.kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui | │ │ │ └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'how to view file contents')
open-webui | │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui | │ └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui | result = query_doc_with_hybrid_search(
open-webui | └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui |
open-webui | > File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui | result = compression_retriever.invoke(query)
open-webui | │ │ └ 'how to view file contents'
open-webui | │ └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui | result = self._get_relevant_documents(
open-webui | │ └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui | compressed_docs = self.base_compressor.compress_documents(
open-webui | │ │ └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui | │ └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 890, in compress_documents
open-webui | zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
open-webui | │ │ │ └ None
open-webui | │ │ └ None
open-webui | │ └ None
open-webui | └ [Document(metadata={'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": ...
open-webui |
open-webui | AttributeError: 'NoneType' object has no attribute 'tolist'
open-webui | 2025-07-24 22:12:36.571 | ERROR | open_webui.retrieval.utils:process_query:367 - Error when querying the collection with hybrid_search: 'NoneType' object has no attribute 'tolist' - {}
open-webui | Traceback (most recent call last):
open-webui |
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui | self._bootstrap_inner()
open-webui | │ └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui | └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui | self.run()
open-webui | │ └ <function Thread.run at 0x7023d8504680>
open-webui | └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui | self._target(*self._args, **self._kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | │ │ │ └ (<weakref at 0x70237819f330; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui | │ │ └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | │ └ <function _worker at 0x7023d741fd80>
open-webui | └ <Thread(ThreadPoolExecutor-17_2, started 123297696036544)>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui | work_item.run()
open-webui | │ └ <function _WorkItem.run at 0x7023d741fec0>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui | result = self.fn(*self.args, **self.kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui | │ │ │ └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'methods to inspect file contents')
open-webui | │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui | │ └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3d990>
open-webui |
open-webui | > File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui | result = query_doc_with_hybrid_search(
open-webui | └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 188, in query_doc_with_hybrid_search
open-webui | raise e
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui | result = compression_retriever.invoke(query)
open-webui | │ │ └ 'methods to inspect file contents'
open-webui | │ └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui | result = self._get_relevant_documents(
open-webui | │ └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui | compressed_docs = self.base_compressor.compress_documents(
open-webui | │ │ └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui | │ └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 890, in compress_documents
open-webui | zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
open-webui | │ │ │ └ None
open-webui | │ │ └ None
open-webui | │ └ None
open-webui | └ [Document(metadata={'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": ...
open-webui |
open-webui | AttributeError: 'NoneType' object has no attribute 'tolist'
open-webui | 2025-07-24 22:12:36.573 | ERROR | open_webui.retrieval.utils:process_query:367 - Error when querying the collection with hybrid_search: 'NoneType' object has no attribute 'tolist' - {}
open-webui | Traceback (most recent call last):
open-webui |
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
open-webui | self._bootstrap_inner()
open-webui | │ └ <function Thread._bootstrap_inner at 0x7023d85049a0>
open-webui | └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
open-webui | self.run()
open-webui | │ └ <function Thread.run at 0x7023d8504680>
open-webui | └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | File "/usr/local/lib/python3.11/threading.py", line 982, in run
open-webui | self._target(*self._args, **self._kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | │ │ │ └ (<weakref at 0x7023526ab3d0; to 'ThreadPoolExecutor' at 0x70237e1607d0>, <_queue.SimpleQueue object at 0x7023526a84f0>, None,...
open-webui | │ │ └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | │ └ <function _worker at 0x7023d741fd80>
open-webui | └ <Thread(ThreadPoolExecutor-17_1, started 123297687643840)>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
open-webui | work_item.run()
open-webui | │ └ <function _WorkItem.run at 0x7023d741fec0>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui | File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
open-webui | result = self.fn(*self.args, **self.kwargs)
open-webui | │ │ │ │ │ └ {}
open-webui | │ │ │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui | │ │ │ └ ('file-09bc8315-e798-4658-b047-4a75a4527650', 'how to view file contents')
open-webui | │ │ └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui | │ └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x70237e95b560>
open-webui | └ <concurrent.futures.thread._WorkItem object at 0x702351e3fb90>
open-webui |
open-webui | > File "/app/backend/open_webui/retrieval/utils.py", line 354, in process_query
open-webui | result = query_doc_with_hybrid_search(
open-webui | └ <function query_doc_with_hybrid_search at 0x702388f7e660>
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 188, in query_doc_with_hybrid_search
open-webui | raise e
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 161, in query_doc_with_hybrid_search
open-webui | result = compression_retriever.invoke(query)
open-webui | │ │ └ 'how to view file contents'
open-webui | │ └ <function BaseRetriever.invoke at 0x7023894bfe20>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 259, in invoke
open-webui | result = self._get_relevant_documents(
open-webui | │ └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7023894bf420>
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui | File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 48, in _get_relevant_documents
open-webui | compressed_docs = self.base_compressor.compress_documents(
open-webui | │ │ └ <function RerankCompressor.compress_documents at 0x702388f7f740>
open-webui | │ └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7023533a...
open-webui | └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
open-webui |
open-webui | File "/app/backend/open_webui/retrieval/utils.py", line 890, in compress_documents
open-webui | zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
open-webui | │ │ │ └ None
open-webui | │ │ └ None
open-webui | │ └ None
open-webui | └ [Document(metadata={'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": ...
open-webui |
open-webui | AttributeError: 'NoneType' object has no attribute 'tolist'
open-webui | 2025-07-24 22:12:36.786 | INFO | open_webui.retrieval.utils:query_doc:89 - query_doc:result [['6b66303b-fdb7-4205-862c-d4a9df7756cf', 'f4953968-c276-4e8a-a467-07700dbbdc4c', 'd309bacc-50c3-493b-adb4-d00d391e020f', 'c399b97d-7e2e-420a-a3e8-1dc655e46b11', 'b5e4d04f-23b8-4ca6-bbff-9fb7e9e2fdf5', 'abce3182-747d-418a-8254-1dd1d1ecce81', '40f0bd30-b7cc-49d8-9deb-ec007259bddc', '05df30a7-9620-4876-879a-fd308b3b5b04', 'f8faa2dd-19dc-40c7-9248-6abc1047edc6', 'f7be2591-adb2-4fb6-968f-35298787a668']] [[{'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 6327}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 1866}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 15619}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 10953}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 3651}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 17434}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 7292}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 2732}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 11926}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 8189}]] - {}
open-webui | 2025-07-24 22:12:36.787 | INFO | open_webui.retrieval.utils:query_doc:89 - query_doc:result [['6b66303b-fdb7-4205-862c-d4a9df7756cf', 'c5ace266-ae90-469b-8045-b4050c30cd4c', 'f4953968-c276-4e8a-a467-07700dbbdc4c', 'd309bacc-50c3-493b-adb4-d00d391e020f', 'c399b97d-7e2e-420a-a3e8-1dc655e46b11', '05df30a7-9620-4876-879a-fd308b3b5b04', 'abce3182-747d-418a-8254-1dd1d1ecce81', 'b5e4d04f-23b8-4ca6-bbff-9fb7e9e2fdf5', '40f0bd30-b7cc-49d8-9deb-ec007259bddc', 'f6baf0ef-92a3-4cd4-9dec-eef2679ae4db']] [[{'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 6327}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 0}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 1866}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 15619}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 
'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 10953}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 2732}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 17434}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 3651}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 7292}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 12896}]] - {} open-webui | 2025-07-24 22:12:36.788 | INFO | open_webui.retrieval.utils:query_doc:89 - query_doc:result [['6b66303b-fdb7-4205-862c-d4a9df7756cf', 
'f4953968-c276-4e8a-a467-07700dbbdc4c', 'd309bacc-50c3-493b-adb4-d00d391e020f', 'c399b97d-7e2e-420a-a3e8-1dc655e46b11', 'c5ace266-ae90-469b-8045-b4050c30cd4c', 'b5e4d04f-23b8-4ca6-bbff-9fb7e9e2fdf5', '05df30a7-9620-4876-879a-fd308b3b5b04', '40f0bd30-b7cc-49d8-9deb-ec007259bddc', 'f8faa2dd-19dc-40c7-9248-6abc1047edc6', 'c0b9159f-cb19-4836-bcd7-088c95749b2e']] [[{'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 6327}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 1866}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 15619}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 10953}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 
'TextFile.txt', 'start_index': 0}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 3651}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 2732}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 7292}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 11926}, {'created_by': '026e8f23-448a-4c58-ab9c-dcb4b4ee50a0', 'embedding_config': '{"engine": "openai", "model": "cohere.embed-english-v3"}', 'file_id': '09bc8315-e798-4658-b047-4a75a4527650', 'hash': 'a626995a59d793c4d0858ed61a34288998e2768e1f0c8cacd46320ff6db2dcfc', 'name': 'TextFile.txt', 'source': 'TextFile.txt', 'start_index': 9086}]] - {} ```
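The `AttributeError` in the traceback above comes from `compress_documents` calling `scores.tolist()` on a value that is `None`, i.e. the reranking function returned nothing instead of a score array. A minimal sketch of that failure mode and a defensive guard; the function below is illustrative only, not Open WebUI's actual implementation:

```python
def compress_documents(documents, reranking_function, query):
    """Illustrative sketch: rerank `documents` against `query`.

    If the reranker fails upstream (e.g. its HTTP endpoint errors out)
    and returns None instead of scores, calling `scores.tolist()`
    raises AttributeError -- the exact error shown in the logs.
    """
    scores = reranking_function([(query, doc) for doc in documents])

    # Defensive guard: fall back to the original order instead of crashing.
    if scores is None:
        return documents

    scored = zip(
        documents,
        scores.tolist() if not isinstance(scores, list) else scores,
    )
    # Highest-scoring documents first.
    return [doc for doc, _ in sorted(scored, key=lambda pair: pair[1], reverse=True)]
```

With a guard like this, a reranker outage would degrade to unreranked results rather than aborting the whole hybrid-search query.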

@tjbck commented on GitHub (Jul 24, 2025):

@sendmebits the logs indicate it was never configured correctly in the first place, and you weren't actually using the external reranker, this seems unrelated to the issue(s) being discussed here.


@aldodelgado commented on GitHub (Jul 24, 2025):

> Thanks for the tips @cyberclaw03. I clicked the "Reindex all existing documents" button on the administrator page and it works again with 0.6.18, @tjbck.

This solved the issue for me. I'm also using qdrant on open-webui version 0.6.18.


@zbejas commented on GitHub (Jul 30, 2025):

I have the same issue. I use Ollama models for reranking, and RAG only works if I disable hybrid search. I have tried swapping out models and reindexing, but nothing works with hybrid search enabled. The only change since it was working as intended was an update to Open WebUI.


@rgaricano commented on GitHub (Jul 30, 2025):

> I have the same issue. I use Ollama models for reranking, and RAG only works if I disable hybrid search. I have tried swapping out models and reindexing, but nothing works with hybrid search enabled. The only change since it was working as intended was an update to Open WebUI.

Ollama does not currently support reranking, although they are working on it. Note that the "external reranker" setting is not the model itself: it is the process that returns a reranked list of documents, while the model field only indicates which model to use.
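For context on what that external reranking process looks like: servers of this style expose a `/v1/rerank`-type endpoint that takes the query plus candidate documents and returns per-document relevance scores. A rough sketch under stated assumptions (the URL path and payload field names follow the common Jina/Cohere-compatible rerank API, not necessarily your server's exact schema), using only the standard library:

```python
import json
import urllib.request


def scores_from_rerank_response(data, n_docs):
    """Map a rerank API response back to input order: one score per document."""
    scores = [0.0] * n_docs
    for item in data["results"]:
        scores[item["index"]] = item["relevance_score"]
    return scores


def rerank(base_url, model, query, documents, timeout=30):
    """Call a /v1/rerank-style endpoint; raises on HTTP errors (e.g. the 500s
    seen in the logs in this thread)."""
    payload = json.dumps(
        {"model": model, "query": query, "documents": documents}
    ).encode()
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/rerank",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return scores_from_rerank_response(json.load(resp), len(documents))
```

Hitting the endpoint directly like this (or with `curl`) is a quick way to check whether the reranker server itself is the thing failing, independent of Open WebUI.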


@onestardao commented on GitHub (Aug 1, 2025):

@dotmobo

hey just saw this —
love that you’re pushing for hybrid retrieval (vector + fulltext + metadata).
but let me point out the silent killer that keeps wrecking these setups even when everything "looks fine" on paper:

!!! what actually breaks:
You do hit the right file,

you do run fulltext,

you do combine metadata...
...but the retrieved chunk drifts semantically from what the model thinks it retrieved.
→ So now your LLM is confidently hallucinating based on a chunk that’s technically “relevant” but logically off.

!!!! this maps exactly to what I call:
No.1 — semantic boundary drift

No.5 — cosine says “yes”, logic says “hell no”

No.2 — downstream collapse due to context misalignment
You might not see the problem until the reasoning step gives you a subtle wrong answer that nobody catches.

i got tired of fixing this again and again across projects —
so I ended up building a full failure map + open-sourced all the logic behind it:
👉 WFGY ProblemMap (MIT license)

it’s not a product. it’s a diagnosis system.
you don’t have to use it — just steal what’s useful.

also, not to flex but...
!!!! the author of Tesseract.js starred the project himself.
(yes, that one. the OCR guy. you can see WFGY on top1 now)

so if you’ve ever thought

“this RAG setup should work, but somehow it keeps subtly screwing up”
you’re not crazy. it’s a thing.
and I’ve mapped out exactly where and why it breaks.

if you’re curious, hit me up. otherwise, all yours — wild MIT license, no tracking, no weirdness.
we just want RAG to stop hallucinating, right?


@le-patenteux commented on GitHub (Aug 1, 2025):

Same issue for me, with a local Ollama backend for embedding/retrieval.

Reindexing: does not work.
Switching off hybrid search: works.
Reverting to v0.6.15: everything works fine again.

The issue is clearly with retrieval on v0.6.16+.

Setup:
Default Open WebUI vector database (ChromaDB, I think?)
Ollama server is reachable, no issues
Embedding: bge-m3 (tried multiple)
Reranker: linux6200/bge-reranker-v2-m3
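For readers unsure what hybrid search adds on top of the plain vector retrieval that still works in these reports: it fuses a keyword (BM25) ranking with the vector ranking, commonly via reciprocal rank fusion, and only then reranks. A toy sketch of the fusion step (illustrative only, not Open WebUI's actual code):

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse several ranked lists of doc ids via reciprocal rank fusion.

    Each ranking is a list of ids, best first.
    score(d) = sum over rankings of 1 / (k + rank_of_d).
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)
```

Because the keyword and vector legs succeed here, the failures in this thread point at the stage after fusion, i.e. the reranking call.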


@le-patenteux commented on GitHub (Aug 11, 2025):

I just came back from vacation and updated from v0.6.15 to v0.6.21 (a fix for hybrid search was added in v0.6.19, if I am reading correctly).
Hybrid search with the Ollama backend is still broken for me:

Image

See my previous post for more details on my setup. I confirm it is running on ChromaDB.


@onestardao commented on GitHub (Aug 11, 2025):

yo @patentsaur — just a quick heads-up that might save you a ton of time.

you can grab our TXTOS pack (MIT license) and ask your AI directly:

“Use the WFGY formulas in this file to fix hybrid retrieval hallucination in my setup”

the math will kick in on the semantic layer — no infra changes needed.
think of it like a semantic firewall: the fix runs before the model ever queries or ranks.

lot of folks who tried this were surprised how simple it was
just drop in, ask the right way, and suddenly the model stops doing dumb things.

also, this catches not just current issues but a few nasty bugs you'll likely hit later (like pre-deploy drift or index mismatch).
worth a shot if you’re stuck in that “everything looks fine but still breaks” loop ^____^

Problem Map again
https://github.com/onestardao/WFGY/blob/main/ProblemMap/README.md


@sthemeow commented on GitHub (Aug 12, 2025):

Same issue here. Happens with 0.6.18 and 0.6.22. Downgraded to 0.6.15 and it works fine


@sbutler2901 commented on GitHub (Aug 16, 2025):

Same for me on v0.6.22

2025-08-16 00:41:43.297 | INFO     | open_webui.routers.openai:get_all_models:397 - get_all_models()
2025-08-16 00:41:54.460 | INFO     | open_webui.routers.openai:get_all_models:397 - get_all_models()
2025-08-16 00:42:02.027 | INFO     | open_webui.retrieval.utils:query_collection_with_hybrid_search:352 - Starting hybrid search for 3 queries in 1 collections...
2025-08-16 00:42:07.602 | WARNING  | chromadb.segment.impl.vector.local_persistent_hnsw:query_vectors:423 - Number of requested results 10 is greater than number of elements in index 1, updating n_results = 1
2025-08-16 00:42:07.604 | INFO     | open_webui.retrieval.models.external:predict:40 - ExternalReranker:predict:model klnstpr/bge-reranker-v2-m3
2025-08-16 00:42:07.604 | INFO     | open_webui.retrieval.models.external:predict:41 - ExternalReranker:predict:query scope of support in software documentation
2025-08-16 00:42:07.652 | WARNING  | chromadb.segment.impl.vector.local_persistent_hnsw:query_vectors:423 - Number of requested results 10 is greater than number of elements in index 1, updating n_results = 1
2025-08-16 00:42:07.654 | INFO     | open_webui.retrieval.models.external:predict:40 - ExternalReranker:predict:model klnstpr/bge-reranker-v2-m3
2025-08-16 00:42:07.654 | INFO     | open_webui.retrieval.models.external:predict:41 - ExternalReranker:predict:query support policy guidelines for open-source projects
2025-08-16 00:42:07.694 | WARNING  | chromadb.segment.impl.vector.local_persistent_hnsw:query_vectors:423 - Number of requested results 10 is greater than number of elements in index 1, updating n_results = 1
2025-08-16 00:42:07.696 | INFO     | open_webui.retrieval.models.external:predict:40 - ExternalReranker:predict:model klnstpr/bge-reranker-v2-m3
2025-08-16 00:42:07.696 | INFO     | open_webui.retrieval.models.external:predict:41 - ExternalReranker:predict:query defining support boundaries in technical documentation
2025-08-16 00:42:13.149 | ERROR    | open_webui.retrieval.models.external:predict:73 - Error in external reranking: 500 Server Error: Internal Server Error for url: http://127.0.0.1:9292/v1/rerank
Traceback (most recent call last):

  File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
    self._bootstrap_inner()
    │    └ <function Thread._bootstrap_inner at 0x7ff45b6349a0>
    └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
  File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
    self.run()
    │    └ <function Thread.run at 0x7ff45b634680>
    └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
  File "/usr/local/lib/python3.11/threading.py", line 982, in run
    self._target(*self._args, **self._kwargs)
    │    │        │    │        │    └ {}
    │    │        │    │        └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
    │    │        │    └ (<weakref at 0x7ff415548ef0; to 'ThreadPoolExecutor' at 0x7ff415555110>, <_queue.SimpleQueue object at 0x7ff41552fb00>, None,...
    │    │        └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
    │    └ <function _worker at 0x7ff45a7109a0>
    └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
    work_item.run()
    │         └ <function _WorkItem.run at 0x7ff45a710ae0>
    └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             │    │   │    │       │    └ {}
             │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>
             │    │   │    └ ('file-838f86ce-7522-4b97-a9c1-affbd9a608e2', 'scope of support in software documentation')
             │    │   └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>
             │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x7ff415541080>
             └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>

  File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query
    result = query_doc_with_hybrid_search(
             └ <function query_doc_with_hybrid_search at 0x7ff4205a79c0>

  File "/app/backend/open_webui/retrieval/utils.py", line 165, in query_doc_with_hybrid_search
    result = compression_retriever.invoke(query)
             │                     │      └ 'scope of support in software documentation'
             │                     └ <function BaseRetriever.invoke at 0x7ff4207b1c60>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 261, in invoke
    result = self._get_relevant_documents(
             │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7ff4207b19e0>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
  File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 44, in _get_relevant_documents
    compressed_docs = self.base_compressor.compress_documents(
                      │    │               └ <function RerankCompressor.compress_documents at 0x7ff4205a7e20>
                      │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7ff41693...
                      └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/app/backend/open_webui/retrieval/utils.py", line 950, in compress_documents
    scores = self.reranking_function(
             │    └ <function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7ff41692e3e0>
             └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7ff41693...

  File "/app/backend/open_webui/utils/middleware.py", line 659, in <lambda>
    lambda sentences: request.app.state.RERANKING_FUNCTION(
           │          │       └ <property object at 0x7ff457c5de90>
           │          └ <starlette.requests.Request object at 0x7ff4155558d0>
           └ [('scope of support in software documentation', '---\nsidebar_position: 1600\ntitle: "🤝 Contributing"\n---\n\nimport { TopBan...

  File "/app/backend/open_webui/retrieval/utils.py", line 452, in <lambda>
    return lambda sentences, user=None: reranking_function.predict(
                  │                     │                  └ <function ExternalReranker.predict at 0x7ff418dc93a0>
                  │                     └ <open_webui.retrieval.models.external.ExternalReranker object at 0x7ff417175690>
                  └ [('scope of support in software documentation', '---\nsidebar_position: 1600\ntitle: "🤝 Contributing"\n---\n\nimport { TopBan...

> File "/app/backend/open_webui/retrieval/models/external.py", line 62, in predict
    r.raise_for_status()
    │ └ <function Response.raise_for_status at 0x7ff457e4d300>
    └ <Response [500]>

  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
          │         │                        └ <Response [500]>
          │         └ '500 Server Error: Internal Server Error for url: http://127.0.0.1:9292/v1/rerank'
          └ <class 'requests.exceptions.HTTPError'>

requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http://127.0.0.1:9292/v1/rerank
2025-08-16 00:42:13.150 | ERROR    | open_webui.retrieval.utils:query_doc_with_hybrid_search:191 - Error querying doc file-838f86ce-7522-4b97-a9c1-affbd9a608e2 with hybrid search: 'NoneType' object has no attribute 'tolist'
Traceback (most recent call last):

  File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
    self._bootstrap_inner()
    │    └ <function Thread._bootstrap_inner at 0x7ff45b6349a0>
    └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
  File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
    self.run()
    │    └ <function Thread.run at 0x7ff45b634680>
    └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
  File "/usr/local/lib/python3.11/threading.py", line 982, in run
    self._target(*self._args, **self._kwargs)
    │    │        │    │        │    └ {}
    │    │        │    │        └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
    │    │        │    └ (<weakref at 0x7ff415548ef0; to 'ThreadPoolExecutor' at 0x7ff415555110>, <_queue.SimpleQueue object at 0x7ff41552fb00>, None,...
    │    │        └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
    │    └ <function _worker at 0x7ff45a7109a0>
    └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
    work_item.run()
    │         └ <function _WorkItem.run at 0x7ff45a710ae0>
    └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             │    │   │    │       │    └ {}
             │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>
             │    │   │    └ ('file-838f86ce-7522-4b97-a9c1-affbd9a608e2', 'scope of support in software documentation')
             │    │   └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>
             │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x7ff415541080>
             └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>

  File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query
    result = query_doc_with_hybrid_search(
             └ <function query_doc_with_hybrid_search at 0x7ff4205a79c0>

> File "/app/backend/open_webui/retrieval/utils.py", line 165, in query_doc_with_hybrid_search
    result = compression_retriever.invoke(query)
             │                     │      └ 'scope of support in software documentation'
             │                     └ <function BaseRetriever.invoke at 0x7ff4207b1c60>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 261, in invoke
    result = self._get_relevant_documents(
             │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7ff4207b19e0>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
  File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 44, in _get_relevant_documents
    compressed_docs = self.base_compressor.compress_documents(
                      │    │               └ <function RerankCompressor.compress_documents at 0x7ff4205a7e20>
                      │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7ff41693...
                      └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/app/backend/open_webui/retrieval/utils.py", line 963, in compress_documents
    zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
        │          │                                 │                  └ None
        │          │                                 └ None
        │          └ None
        └ [Document(metadata={'Content-Type': 'text/plain; charset=UTF-8', 'X-Tika-PDFextractInlineImages': 'true', 'created_by': '292e...

AttributeError: 'NoneType' object has no attribute 'tolist'
2025-08-16 00:42:13.151 | ERROR    | open_webui.retrieval.utils:process_query:371 - Error when querying the collection with hybrid_search: 'NoneType' object has no attribute 'tolist'
Traceback (most recent call last):

  File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
    self._bootstrap_inner()
    │    └ <function Thread._bootstrap_inner at 0x7ff45b6349a0>
    └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
  File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
    self.run()
    │    └ <function Thread.run at 0x7ff45b634680>
    └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
  File "/usr/local/lib/python3.11/threading.py", line 982, in run
    self._target(*self._args, **self._kwargs)
    │    │        │    │        │    └ {}
    │    │        │    │        └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
    │    │        │    └ (<weakref at 0x7ff415548ef0; to 'ThreadPoolExecutor' at 0x7ff415555110>, <_queue.SimpleQueue object at 0x7ff41552fb00>, None,...
    │    │        └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
    │    └ <function _worker at 0x7ff45a7109a0>
    └ <Thread(ThreadPoolExecutor-9_0, started 140686265398976)>
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
    work_item.run()
    │         └ <function _WorkItem.run at 0x7ff45a710ae0>
    └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             │    │   │    │       │    └ {}
             │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>
             │    │   │    └ ('file-838f86ce-7522-4b97-a9c1-affbd9a608e2', 'scope of support in software documentation')
             │    │   └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>
             │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x7ff415541080>
             └ <concurrent.futures.thread._WorkItem object at 0x7ff41651f090>

> File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query
    result = query_doc_with_hybrid_search(
             └ <function query_doc_with_hybrid_search at 0x7ff4205a79c0>

  File "/app/backend/open_webui/retrieval/utils.py", line 192, in query_doc_with_hybrid_search
    raise e

  File "/app/backend/open_webui/retrieval/utils.py", line 165, in query_doc_with_hybrid_search
    result = compression_retriever.invoke(query)
             │                     │      └ 'scope of support in software documentation'
             │                     └ <function BaseRetriever.invoke at 0x7ff4207b1c60>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 261, in invoke
    result = self._get_relevant_documents(
             │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7ff4207b19e0>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
  File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 44, in _get_relevant_documents
    compressed_docs = self.base_compressor.compress_documents(
                      │    │               └ <function RerankCompressor.compress_documents at 0x7ff4205a7e20>
                      │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7ff41693...
                      └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/app/backend/open_webui/retrieval/utils.py", line 963, in compress_documents
    zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
        │          │                                 │                  └ None
        │          │                                 └ None
        │          └ None
        └ [Document(metadata={'Content-Type': 'text/plain; charset=UTF-8', 'X-Tika-PDFextractInlineImages': 'true', 'created_by': '292e...

AttributeError: 'NoneType' object has no attribute 'tolist'
2025-08-16 00:42:13.152 | ERROR    | open_webui.retrieval.models.external:predict:73 - Error in external reranking: 500 Server Error: Internal Server Error for url: http://127.0.0.1:9292/v1/rerank
Traceback (most recent call last):

  File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
    self._bootstrap_inner()
    │    └ <function Thread._bootstrap_inner at 0x7ff45b6349a0>
    └ <Thread(ThreadPoolExecutor-9_2, started 140685244602048)>
  File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
    self.run()
    │    └ <function Thread.run at 0x7ff45b634680>
    └ <Thread(ThreadPoolExecutor-9_2, started 140685244602048)>
  File "/usr/local/lib/python3.11/threading.py", line 982, in run
    self._target(*self._args, **self._kwargs)
    │    │        │    │        │    └ {}
    │    │        │    │        └ <Thread(ThreadPoolExecutor-9_2, started 140685244602048)>
    │    │        │    └ (<weakref at 0x7ff4155488b0; to 'ThreadPoolExecutor' at 0x7ff415555110>, <_queue.SimpleQueue object at 0x7ff41552fb00>, None,...
    │    │        └ <Thread(ThreadPoolExecutor-9_2, started 140685244602048)>
    │    └ <function _worker at 0x7ff45a7109a0>
    └ <Thread(ThreadPoolExecutor-9_2, started 140685244602048)>
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
    work_item.run()
    │         └ <function _WorkItem.run at 0x7ff45a710ae0>
    └ <concurrent.futures.thread._WorkItem object at 0x7ff41557b610>
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             │    │   │    │       │    └ {}
             │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x7ff41557b610>
             │    │   │    └ ('file-838f86ce-7522-4b97-a9c1-affbd9a608e2', 'defining support boundaries in technical documentation')
             │    │   └ <concurrent.futures.thread._WorkItem object at 0x7ff41557b610>
             │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x7ff415541080>
             └ <concurrent.futures.thread._WorkItem object at 0x7ff41557b610>

  File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query
    result = query_doc_with_hybrid_search(
             └ <function query_doc_with_hybrid_search at 0x7ff4205a79c0>

  File "/app/backend/open_webui/retrieval/utils.py", line 165, in query_doc_with_hybrid_search
    result = compression_retriever.invoke(query)
             │                     │      └ 'defining support boundaries in technical documentation'
             │                     └ <function BaseRetriever.invoke at 0x7ff4207b1c60>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 261, in invoke
    result = self._get_relevant_documents(
             │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7ff4207b19e0>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
  File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 44, in _get_relevant_documents
    compressed_docs = self.base_compressor.compress_documents(
                      │    │               └ <function RerankCompressor.compress_documents at 0x7ff4205a7e20>
                      │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7ff41693...
                      └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/app/backend/open_webui/retrieval/utils.py", line 950, in compress_documents
    scores = self.reranking_function(
             │    └ <function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7ff41692e3e0>
             └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7ff41693...

  File "/app/backend/open_webui/utils/middleware.py", line 659, in <lambda>
    lambda sentences: request.app.state.RERANKING_FUNCTION(
           │          │       └ <property object at 0x7ff457c5de90>
           │          └ <starlette.requests.Request object at 0x7ff4155558d0>
           └ [('defining support boundaries in technical documentation', '---\nsidebar_position: 1600\ntitle: "🤝 Contributing"\n---\n\nimp...

  File "/app/backend/open_webui/retrieval/utils.py", line 452, in <lambda>
    return lambda sentences, user=None: reranking_function.predict(
                  │                     │                  └ <function ExternalReranker.predict at 0x7ff418dc93a0>
                  │                     └ <open_webui.retrieval.models.external.ExternalReranker object at 0x7ff417175690>
                  └ [('defining support boundaries in technical documentation', '---\nsidebar_position: 1600\ntitle: "🤝 Contributing"\n---\n\nimp...

> File "/app/backend/open_webui/retrieval/models/external.py", line 62, in predict
    r.raise_for_status()
    │ └ <function Response.raise_for_status at 0x7ff457e4d300>
    └ <Response [500]>

  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
          │         │                        └ <Response [500]>
          │         └ '500 Server Error: Internal Server Error for url: http://127.0.0.1:9292/v1/rerank'
          └ <class 'requests.exceptions.HTTPError'>

requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http://127.0.0.1:9292/v1/rerank
2025-08-16 00:42:13.154 | ERROR    | open_webui.retrieval.utils:query_doc_with_hybrid_search:191 - Error querying doc file-838f86ce-7522-4b97-a9c1-affbd9a608e2 with hybrid search: 'NoneType' object has no attribute 'tolist'
Traceback (most recent call last):

  File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
    self._bootstrap_inner()
    │    └ <function Thread._bootstrap_inner at 0x7ff45b6349a0>
    └ <Thread(ThreadPoolExecutor-9_2, started 140685244602048)>
  File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
    self.run()
    │    └ <function Thread.run at 0x7ff45b634680>
    └ <Thread(ThreadPoolExecutor-9_2, started 140685244602048)>
  File "/usr/local/lib/python3.11/threading.py", line 982, in run
    self._target(*self._args, **self._kwargs)
    │    │        │    │        │    └ {}
    │    │        │    │        └ <Thread(ThreadPoolExecutor-9_2, started 140685244602048)>
    │    │        │    └ (<weakref at 0x7ff4155488b0; to 'ThreadPoolExecutor' at 0x7ff415555110>, <_queue.SimpleQueue object at 0x7ff41552fb00>, None,...
    │    │        └ <Thread(ThreadPoolExecutor-9_2, started 140685244602048)>
    │    └ <function _worker at 0x7ff45a7109a0>
    └ <Thread(ThreadPoolExecutor-9_2, started 140685244602048)>
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker
    work_item.run()
    │         └ <function _WorkItem.run at 0x7ff45a710ae0>
    └ <concurrent.futures.thread._WorkItem object at 0x7ff41557b610>
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             │    │   │    │       │    └ {}
             │    │   │    │       └ <concurrent.futures.thread._WorkItem object at 0x7ff41557b610>
             │    │   │    └ ('file-838f86ce-7522-4b97-a9c1-affbd9a608e2', 'defining support boundaries in technical documentation')
             │    │   └ <concurrent.futures.thread._WorkItem object at 0x7ff41557b610>
             │    └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x7ff415541080>
             └ <concurrent.futures.thread._WorkItem object at 0x7ff41557b610>

  File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query
    result = query_doc_with_hybrid_search(
             └ <function query_doc_with_hybrid_search at 0x7ff4205a79c0>

> File "/app/backend/open_webui/retrieval/utils.py", line 165, in query_doc_with_hybrid_search
    result = compression_retriever.invoke(query)
             │                     │      └ 'defining support boundaries in technical documentation'
             │                     └ <function BaseRetriever.invoke at 0x7ff4207b1c60>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 261, in invoke
    result = self._get_relevant_documents(
             │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7ff4207b19e0>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
  File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 44, in _get_relevant_documents
    compressed_docs = self.base_compressor.compress_documents(
                      │    │               └ <function RerankCompressor.compress_documents at 0x7ff4205a7e20>
                      │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7ff41693...
                      └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/app/backend/open_webui/retrieval/utils.py", line 963, in compress_documents
    zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
        │          │                                 │                  └ None
        │          │                                 └ None
        │          └ None
        └ [Document(metadata={'Content-Type': 'text/plain; charset=UTF-8', 'X-Tika-PDFextractInlineImages': 'true', 'created_by': '292e...

AttributeError: 'NoneType' object has no attribute 'tolist'
2025-08-16 00:42:13.161 | INFO     | open_webui.routers.openai:get_all_models:397 - get_all_models()
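The crash is the interaction of two failures visible in the traceback: the external `/v1/rerank` call returns a 500, `ExternalReranker.predict` logs the error and evidently returns `None`, and `compress_documents` then calls `.tolist()` on that `None`. A minimal sketch of the failure mode with a hypothetical guard (illustrative names only, not the actual Open WebUI code):

```python
# Sketch of the failure chain seen above: a reranker call that fails yields
# None instead of a score array, and the unguarded line
#   zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
# then raises AttributeError on None.

def compress_documents(documents, scores):
    # Hypothetical guard: surface a failed rerank (scores is None) as an
    # explicit error instead of letting scores.tolist() blow up.
    if scores is None:
        raise RuntimeError("reranker returned no scores (upstream request failed)")
    score_list = scores.tolist() if not isinstance(scores, list) else scores
    # Pair each document with its score, best score first.
    return sorted(zip(documents, score_list), key=lambda pair: pair[1], reverse=True)
```

Without the `scores is None` guard, `None.tolist()` raises exactly the `AttributeError: 'NoneType' object has no attribute 'tolist'` shown in the log; the root cause is still the 500 from the rerank endpoint, which a guard like this would only make visible instead of masking behind the `AttributeError`.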

Similar to: https://github.com/open-webui/open-webui/issues/16228#issuecomment-3146579709:

sudo docker exec -it open-webui curl http://127.0.0.1:10005/v1/rerank \
  -H "Content-Type: application/json" \
  -d '{
  "model": "klnstpr/bge-reranker-v2-m3",
  "query": "Organic skincare products for sensitive skin",
  "documents": [
    "Eco-friendly kitchenware for modern homes",
    "Biodegradable cleaning supplies for eco-conscious consumers",
    "Organic cotton baby clothes for sensitive skin",
    "Natural organic skincare range for sensitive skin",
    "Tech gadgets for smart homes: 2024 edition",
    "Sustainable gardening tools and compost solutions",
    "Sensitive skin-friendly facial cleansers and toners",
    "Organic food wraps and storage solutions",
    "All-natural pet food for dogs with allergies",
    "Yoga mats made from recycled materials"
  ],
  "top_n": 3
}'
{
  "model": "klnstpr/bge-reranker-v2-m3",
  "object": "list",
  "usage": {"prompt_tokens": 195, "total_tokens": 195},
  "results": [
    {"index": 0, "relevance_score": -10.885379791259766},
    {"index": 1, "relevance_score": -8.234384536743164},
    {"index": 2, "relevance_score": -0.7093545794487},
    {"index": 3, "relevance_score": 7.277642250061035},
    {"index": 4, "relevance_score": -11.046594619750977},
    {"index": 5, "relevance_score": -11.040145874023438},
    {"index": 6, "relevance_score": 0.6702241897583008},
    {"index": 7, "relevance_score": -7.417957782745361},
    {"index": 8, "relevance_score": -9.351853370666504},
    {"index": 9, "relevance_score": -10.990015029907227}
  ]
}
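The direct curl call above shows the rerank endpoint itself returns usable scores, which points the breakage back at the Open WebUI side. For reference, reducing that response shape to ranked documents is trivial client-side; a minimal sketch, assuming only the `results: [{index, relevance_score}, ...]` shape shown above:

```python
# Turn a /v1/rerank-style response into documents ordered by relevance_score.
# Nothing here is specific to Open WebUI; it only assumes the JSON shape
# returned by the curl call above.

def rank_documents(documents, response):
    # Sort the result entries by score, best first, then map indices back
    # to the original document strings.
    results = sorted(response["results"],
                     key=lambda r: r["relevance_score"], reverse=True)
    return [(documents[r["index"]], r["relevance_score"]) for r in results]
```

Applied to the response above, index 3 ("Natural organic skincare range for sensitive skin", score 7.28) correctly comes out on top, so the scores themselves are sane.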
@sbutler2901 commented on GitHub (Aug 16, 2025):

Same for me on v0.6.22

```
2025-08-16 00:41:43.297 | INFO     | open_webui.routers.openai:get_all_models:397 - get_all_models()
2025-08-16 00:41:54.460 | INFO     | open_webui.routers.openai:get_all_models:397 - get_all_models()
2025-08-16 00:42:02.027 | INFO     | open_webui.retrieval.utils:query_collection_with_hybrid_search:352 - Starting hybrid search for 3 queries in 1 collections...
2025-08-16 00:42:07.602 | WARNING  | chromadb.segment.impl.vector.local_persistent_hnsw:query_vectors:423 - Number of requested results 10 is greater than number of elements in index 1, updating n_results = 1
2025-08-16 00:42:07.604 | INFO     | open_webui.retrieval.models.external:predict:40 - ExternalReranker:predict:model klnstpr/bge-reranker-v2-m3
2025-08-16 00:42:07.604 | INFO     | open_webui.retrieval.models.external:predict:41 - ExternalReranker:predict:query scope of support in software documentation
2025-08-16 00:42:07.652 | WARNING  | chromadb.segment.impl.vector.local_persistent_hnsw:query_vectors:423 - Number of requested results 10 is greater than number of elements in index 1, updating n_results = 1
2025-08-16 00:42:07.654 | INFO     | open_webui.retrieval.models.external:predict:40 - ExternalReranker:predict:model klnstpr/bge-reranker-v2-m3
2025-08-16 00:42:07.654 | INFO     | open_webui.retrieval.models.external:predict:41 - ExternalReranker:predict:query support policy guidelines for open-source projects
2025-08-16 00:42:07.694 | WARNING  | chromadb.segment.impl.vector.local_persistent_hnsw:query_vectors:423 - Number of requested results 10 is greater than number of elements in index 1, updating n_results = 1
2025-08-16 00:42:07.696 | INFO     | open_webui.retrieval.models.external:predict:40 - ExternalReranker:predict:model klnstpr/bge-reranker-v2-m3
2025-08-16 00:42:07.696 | INFO     | open_webui.retrieval.models.external:predict:41 - ExternalReranker:predict:query defining support boundaries in technical documentation
2025-08-16 00:42:13.149 | ERROR    | open_webui.retrieval.models.external:predict:73 - Error in external reranking: 500 Server Error: Internal Server Error for url: http://127.0.0.1:9292/v1/rerank
Traceback (most recent call last):
  [threading/ThreadPoolExecutor worker frames and loguru variable annotations omitted]
  File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query
    result = query_doc_with_hybrid_search(
  File "/app/backend/open_webui/retrieval/utils.py", line 165, in query_doc_with_hybrid_search
    result = compression_retriever.invoke(query)
  File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 261, in invoke
    result = self._get_relevant_documents(
  File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 44, in _get_relevant_documents
    compressed_docs = self.base_compressor.compress_documents(
  File "/app/backend/open_webui/retrieval/utils.py", line 950, in compress_documents
    scores = self.reranking_function(
  File "/app/backend/open_webui/utils/middleware.py", line 659, in <lambda>
    lambda sentences: request.app.state.RERANKING_FUNCTION(
  File "/app/backend/open_webui/retrieval/utils.py", line 452, in <lambda>
    return lambda sentences, user=None: reranking_function.predict(
> File "/app/backend/open_webui/retrieval/models/external.py", line 62, in predict
    r.raise_for_status()
  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http://127.0.0.1:9292/v1/rerank

2025-08-16 00:42:13.150 | ERROR    | open_webui.retrieval.utils:query_doc_with_hybrid_search:191 - Error querying doc file-838f86ce-7522-4b97-a9c1-affbd9a608e2 with hybrid search: 'NoneType' object has no attribute 'tolist'
Traceback (most recent call last):
  [worker frames omitted]
  File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query
    result = query_doc_with_hybrid_search(
> File "/app/backend/open_webui/retrieval/utils.py", line 165, in query_doc_with_hybrid_search
    result = compression_retriever.invoke(query)
  [langchain frames as above omitted]
  File "/app/backend/open_webui/retrieval/utils.py", line 963, in compress_documents
    zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
AttributeError: 'NoneType' object has no attribute 'tolist'

2025-08-16 00:42:13.151 | ERROR    | open_webui.retrieval.utils:process_query:371 - Error when querying the collection with hybrid_search: 'NoneType' object has no attribute 'tolist'
[same traceback as above, ending in AttributeError: 'NoneType' object has no attribute 'tolist']

[the same ERROR/traceback triple repeats for the queries "defining support boundaries in technical documentation" (00:42:13.152-00:42:13.155) and "support policy guidelines for open-source projects" (00:42:13.156-00:42:13.157)]
```
│ │ └ <Thread(ThreadPoolExecutor-9_1, started 140685864851136)> │ └ <function _worker at 0x7ff45a7109a0> └ <Thread(ThreadPoolExecutor-9_1, started 140685864851136)> File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker work_item.run() │ └ <function _WorkItem.run at 0x7ff45a710ae0> └ <concurrent.futures.thread._WorkItem object at 0x7ff4155b6450> File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run result = self.fn(*self.args, **self.kwargs) │ │ │ │ │ └ {} │ │ │ │ └ <concurrent.futures.thread._WorkItem object at 0x7ff4155b6450> │ │ │ └ ('file-838f86ce-7522-4b97-a9c1-affbd9a608e2', 'support policy guidelines for open-source projects') │ │ └ <concurrent.futures.thread._WorkItem object at 0x7ff4155b6450> │ └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x7ff415541080> └ <concurrent.futures.thread._WorkItem object at 0x7ff4155b6450> File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query result = query_doc_with_hybrid_search( └ <function query_doc_with_hybrid_search at 0x7ff4205a79c0> > File "/app/backend/open_webui/retrieval/utils.py", line 165, in query_doc_with_hybrid_search result = compression_retriever.invoke(query) │ │ └ 'support policy guidelines for open-source projects' │ └ <function BaseRetriever.invoke at 0x7ff4207b1c60> └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l... File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 261, in invoke result = self._get_relevant_documents( │ └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7ff4207b19e0> └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l... 
File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 44, in _get_relevant_documents compressed_docs = self.base_compressor.compress_documents( │ │ └ <function RerankCompressor.compress_documents at 0x7ff4205a7e20> │ └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7ff41693... └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l... File "/app/backend/open_webui/retrieval/utils.py", line 963, in compress_documents zip(documents, scores.tolist() if not isinstance(scores, list) else scores) │ │ │ └ None │ │ └ None │ └ None └ [Document(metadata={'Content-Type': 'text/plain; charset=UTF-8', 'X-Tika-PDFextractInlineImages': 'true', 'created_by': '292e... AttributeError: 'NoneType' object has no attribute 'tolist' 2025-08-16 00:42:13.159 | ERROR | open_webui.retrieval.utils:process_query:371 - Error when querying the collection with hybrid_search: 'NoneType' object has no attribute 'tolist' Traceback (most recent call last): File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap self._bootstrap_inner() │ └ <function Thread._bootstrap_inner at 0x7ff45b6349a0> └ <Thread(ThreadPoolExecutor-9_1, started 140685864851136)> File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner self.run() │ └ <function Thread.run at 0x7ff45b634680> └ <Thread(ThreadPoolExecutor-9_1, started 140685864851136)> File "/usr/local/lib/python3.11/threading.py", line 982, in run self._target(*self._args, **self._kwargs) │ │ │ │ │ └ {} │ │ │ │ └ <Thread(ThreadPoolExecutor-9_1, started 140685864851136)> │ │ │ └ (<weakref at 0x7ff415548bd0; to 'ThreadPoolExecutor' at 0x7ff415555110>, <_queue.SimpleQueue object at 0x7ff41552fb00>, None,... 
│ │ └ <Thread(ThreadPoolExecutor-9_1, started 140685864851136)> │ └ <function _worker at 0x7ff45a7109a0> └ <Thread(ThreadPoolExecutor-9_1, started 140685864851136)> File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 83, in _worker work_item.run() │ └ <function _WorkItem.run at 0x7ff45a710ae0> └ <concurrent.futures.thread._WorkItem object at 0x7ff4155b6450> File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run result = self.fn(*self.args, **self.kwargs) │ │ │ │ │ └ {} │ │ │ │ └ <concurrent.futures.thread._WorkItem object at 0x7ff4155b6450> │ │ │ └ ('file-838f86ce-7522-4b97-a9c1-affbd9a608e2', 'support policy guidelines for open-source projects') │ │ └ <concurrent.futures.thread._WorkItem object at 0x7ff4155b6450> │ └ <function query_collection_with_hybrid_search.<locals>.process_query at 0x7ff415541080> └ <concurrent.futures.thread._WorkItem object at 0x7ff4155b6450> > File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query result = query_doc_with_hybrid_search( └ <function query_doc_with_hybrid_search at 0x7ff4205a79c0> File "/app/backend/open_webui/retrieval/utils.py", line 192, in query_doc_with_hybrid_search raise e File "/app/backend/open_webui/retrieval/utils.py", line 165, in query_doc_with_hybrid_search result = compression_retriever.invoke(query) │ │ └ 'support policy guidelines for open-source projects' │ └ <function BaseRetriever.invoke at 0x7ff4207b1c60> └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l... File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 261, in invoke result = self._get_relevant_documents( │ └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7ff4207b19e0> └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l... 
File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 44, in _get_relevant_documents compressed_docs = self.base_compressor.compress_documents( │ │ └ <function RerankCompressor.compress_documents at 0x7ff4205a7e20> │ └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7ff41693... └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l... File "/app/backend/open_webui/retrieval/utils.py", line 963, in compress_documents zip(documents, scores.tolist() if not isinstance(scores, list) else scores) │ │ │ └ None │ │ └ None │ └ None └ [Document(metadata={'Content-Type': 'text/plain; charset=UTF-8', 'X-Tika-PDFextractInlineImages': 'true', 'created_by': '292e... AttributeError: 'NoneType' object has no attribute 'tolist' 2025-08-16 00:42:13.161 | INFO | open_webui.routers.openai:get_all_models:397 - get_all_models() ``` Similar to: https://github.com/open-webui/open-webui/issues/16228#issuecomment-3146579709: ``` sudo docker exec -it open-webui curl http://127.0.0.1:10005/v1/rerank \ -H "Content-Type: application/json" \ -d '{ "model": "klnstpr/bge-reranker-v2-m3", "query": "Organic skincare products for sensitive skin", "documents": [ "Eco-friendly kitchenware for modern homes", "Biodegradable cleaning supplies for eco-conscious consumers", "Organic cotton baby clothes for sensitive skin", "Natural organic skincare range for sensitive skin", "Tech gadgets for smart homes: 2024 edition", "Sustainable gardening tools and compost solutions", "Sensitive skin-friendly facial cleansers and toners", "Organic food wraps and storage solutions", "All-natural pet food for dogs with allergies", "Yoga mats made from recycled materials" ], "top_n": 3 }' 
{"model":"klnstpr/bge-reranker-v2-m3","object":"list","usage":{"prompt_tokens":195,"total_tokens":195},"results":[{"index":0,"relevance_score":-10.885379791259766},{"index":1,"relevance_score":-8.234384536743164},{"index":2,"relevance_score":-0.7093545794487},{"index":3,"relevance_score":7.277642250061035},{"index":4,"relevance_score":-11.046594619750977},{"index":5,"relevance_score":-11.040145874023438},{"index":6,"relevance_score":0.6702241897583008},{"index":7,"relevance_score":-7.417957782745361},{"index":8,"relevance_score":-9.351853370666504},{"index":9,"relevance_score":-10.990015029907227}]}% ```
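Note that the response above returns a score for every one of the ten documents, in input order, even though `top_n` was 3, so a client that wants the best matches has to sort by `relevance_score` itself. A minimal sketch of that selection step (the helper name is hypothetical, not Open WebUI code):

```python
# Hypothetical helper: pick the top_n documents from an OpenAI-style
# /v1/rerank response like the one shown above.
def top_n_from_rerank(response, documents, top_n=3):
    results = response.get("results") or []
    # Results arrive indexed by input position; order them by score.
    ranked = sorted(results, key=lambda r: r["relevance_score"], reverse=True)
    return [(documents[r["index"]], r["relevance_score"]) for r in ranked[:top_n]]
```

Applied to the response above, this would select indices 3, 6, and 2 — the three sensitive-skin documents.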

@rgaricano commented on GitHub (Aug 16, 2025):

For reference:
https://github.com/open-webui/open-webui/blob/438e5d966f0f64f9ea3feab22724a5bd96a4127b/backend/open_webui/retrieval/utils.py#L962-L968

fix:

```python
        if scores:
            docs_with_scores = list(
                zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
            )
            if self.r_score:
                docs_with_scores = [
                    (d, s) for d, s in docs_with_scores if s >= self.r_score
                ]
```

@sbutler2901 commented on GitHub (Aug 16, 2025):

Hey @rgaricano, unfortunately that does not fix the issue. For me, this leads to the docs_with_scores variable being undefined:

```
UnboundLocalError: cannot access local variable 'docs_with_scores' where it is not associated with a value
2025-08-16 17:01:08.668 | ERROR    | open_webui.retrieval.utils:process_query:371 - Error when querying the collection with hybrid_search: cannot access local variable 'docs_with_scores' where it is not associated with a value
```

Trying with:

```python
        if scores:
            docs_with_scores = list(
                zip(documents, scores.tolist() if not isinstance(scores, list) else scores)
            )
            if self.r_score:
                docs_with_scores = [
                    (d, s) for d, s in docs_with_scores if s >= self.r_score
                ]
        else:
            docs_with_scores = []
```

Results in:

```
  File "/app/backend/open_webui/retrieval/utils.py", line 358, in process_query
    result = query_doc_with_hybrid_search(
             └ <function query_doc_with_hybrid_search at 0x7fb39f0079c0>

  File "/app/backend/open_webui/retrieval/utils.py", line 165, in query_doc_with_hybrid_search
    result = compression_retriever.invoke(query)
             │                     │      └ 'Methods for analyzing document content'
             │                     └ <function BaseRetriever.invoke at 0x7fb39f219c60>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/usr/local/lib/python3.11/site-packages/langchain_core/retrievers.py", line 261, in invoke
    result = self._get_relevant_documents(
             │    └ <function ContextualCompressionRetriever._get_relevant_documents at 0x7fb39f2199e0>
             └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...
  File "/usr/local/lib/python3.11/site-packages/langchain/retrievers/contextual_compression.py", line 44, in _get_relevant_documents
    compressed_docs = self.base_compressor.compress_documents(
                      │    │               └ <function RerankCompressor.compress_documents at 0x7fb39f007e20>
                      │    └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7fb39994...
                      └ ContextualCompressionRetriever(base_compressor=RerankCompressor(embedding_function=<function chat_completion_files_handler.<l...

  File "/app/backend/open_webui/retrieval/utils.py", line 950, in compress_documents
    scores = self.reranking_function(
             │    └ <function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7fb399d845e0>
             └ RerankCompressor(embedding_function=<function chat_completion_files_handler.<locals>.<lambda>.<locals>.<lambda> at 0x7fb39994...

  File "/app/backend/open_webui/utils/middleware.py", line 659, in <lambda>
    lambda sentences: request.app.state.RERANKING_FUNCTION(
           │          │       └ <property object at 0x7fb3da771c60>
           │          └ <starlette.requests.Request object at 0x7fb3999aec50>
           └ [('Methods for analyzing document content', '---\nsidebar_position: 1600\ntitle: "🤝 Contributing"\n---\n\nimport { TopBanners...

  File "/app/backend/open_webui/retrieval/utils.py", line 452, in <lambda>
    return lambda sentences, user=None: reranking_function.predict(
                  │                     │                  └ <function ExternalReranker.predict at 0x7fb39b5793a0>
                  │                     └ <open_webui.retrieval.models.external.ExternalReranker object at 0x7fb39a11d790>
                  └ [('Methods for analyzing document content', '---\nsidebar_position: 1600\ntitle: "🤝 Contributing"\n---\n\nimport { TopBanners...

> File "/app/backend/open_webui/retrieval/models/external.py", line 62, in predict
    r.raise_for_status()
    │ └ <function Response.raise_for_status at 0x7fb3da95d300>
    └ <Response [500]>

  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
          │         │                        └ <Response [500]>
          │         └ '500 Server Error: Internal Server Error for url: http://127.0.0.1:9292/v1/rerank'
          └ <class 'requests.exceptions.HTTPError'>

requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http://127.0.0.1:9292/v1/rerank
2025-08-16 17:14:37.387 | INFO     | open_webui.retrieval.utils:query_doc_with_hybrid_search:185 - query_doc_with_hybrid_search:result [[]] [[]]
2025-08-16 17:14:37.387 | INFO     | open_webui.retrieval.utils:query_doc_with_hybrid_search:185 - query_doc_with_hybrid_search:result [[]] [[]]
2025-08-16 17:14:37.387 | INFO     | open_webui.retrieval.utils:query_doc_with_hybrid_search:185 - query_doc_with_hybrid_search:result [[]] [[]]
2025-08-16 17:14:37.390 | INFO     | open_webui.routers.openai:get_all_models:397 - get_all_models()
```

@rgaricano commented on GitHub (Aug 17, 2025):

@sbutler2901
try just returning same input documents:

```python
else:
    return documents
```
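Putting the two suggestions in this thread together — guard the `scores is None` case and fall back to the input documents — the scoring step might look like the sketch below. This is illustrative only, not the actual `RerankCompressor.compress_documents` code; the function name and the simplified `r_score` handling are assumptions.

```python
# Sketch only: a None-safe version of the rerank scoring step discussed above.
def rank_documents(documents, scores, r_score=None):
    if scores is None:
        # Reranker failed (e.g. the /v1/rerank call returned 500):
        # fall back to the documents in their original order.
        return list(documents)
    # External rerankers may return a numpy array; normalize to a list.
    scores = scores if isinstance(scores, list) else scores.tolist()
    docs_with_scores = list(zip(documents, scores))
    if r_score is not None:
        # Drop documents below the relevance threshold.
        docs_with_scores = [(d, s) for d, s in docs_with_scores if s >= r_score]
    docs_with_scores.sort(key=lambda pair: pair[1], reverse=True)
    return [d for d, _ in docs_with_scores]
```

With this shape, a reranker outage degrades hybrid search to the un-reranked ordering instead of raising `AttributeError`.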

@tjbck commented on GitHub (Aug 21, 2025):

@sbutler2901 `'500 Server Error: Internal Server Error for url: http://127.0.0.1:9292/v1/rerank'` looks like your rerank engine isn't reachable, have you ruled that out first?
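One way to rule that out from inside the container, besides the `curl` check earlier in the thread, is a stdlib-only probe of the endpoint (a sketch; the URL and model name are placeholders for your setup):

```python
import json
import urllib.error
import urllib.request

def probe_rerank(url="http://127.0.0.1:9292/v1/rerank", model="my-reranker"):
    """Return (True, parsed body) if the endpoint answers 2xx, else (False, error text)."""
    payload = json.dumps(
        {"model": model, "query": "ping", "documents": ["pong"], "top_n": 1}
    ).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return True, json.loads(resp.read())
    except (urllib.error.URLError, OSError) as exc:
        # Connection refused, timeout, or an HTTP error status all land here.
        return False, str(exc)
```

If this returns `(False, ...)` for your rerank URL, the 500 is coming from the rerank engine (or the address/port is wrong), not from Open WebUI's hybrid-search code.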

Reference: github-starred/open-webui#5829