[GH-ISSUE #18592] issue: TypeError/IndexError while inserting embeddings using pgvector and cohere-embed-v4 #57312

Closed
opened 2026-05-05 20:50:31 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @ecktom on GitHub (Oct 24, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/18592

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

v0.6.34

Ollama Version (if applicable)

No response

Operating System

Docker

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

After uploading a PDF to a knowledge base, the embeddings should be stored correctly.

Actual Behavior

A TypeError is raised within /app/backend/open_webui/retrieval/vector/dbs/pgvector.py, and the PDF is not stored.

If RAG_EMBEDDING_OPENAI_BATCH_SIZE is set to e.g. 3, an IndexError appears instead.

Steps to Reproduce

  1. Set up pgvector as the vector database
  2. Set up cohere-embed-v4 as an OpenAI-compatible embedding engine
  3. Try uploading a document to a knowledge base
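
For reference, a hypothetical minimal environment matching the steps above (variable names follow Open WebUI's documented configuration; all values are placeholders, not our actual setup):

```shell
# Vector database backend (placeholder connection string)
VECTOR_DB=pgvector
PGVECTOR_DB_URL=postgresql://user:password@pgvector-host:5432/openwebui

# OpenAI-compatible embedding engine pointed at LiteLLM (placeholder URL/key)
RAG_EMBEDDING_ENGINE=openai
RAG_EMBEDDING_MODEL=cohere-embed-v4
RAG_OPENAI_API_BASE_URL=http://litellm:4000/v1
RAG_OPENAI_API_KEY=sk-placeholder

# Set to e.g. 3 to trigger the IndexError variant instead of the TypeError
RAG_EMBEDDING_OPENAI_BATCH_SIZE=1
```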

Logs & Screenshots

TypeError:

[openwebui] | 2025-10-24 11:41:05.845 | INFO     | open_webui.routers.retrieval:save_docs_to_vector_db:1300 - save_docs_to_vector_db: document <redacted>.pdf 8021f49c-01b9-4e32-b082-df23f08dedfd
[openwebui] | 2025-10-24 11:41:05.911 | INFO     | open_webui.routers.retrieval:save_docs_to_vector_db:1416 - generating embeddings for 8021f49c-01b9-4e32-b082-df23f08dedfd
[litellm]   | INFO:     10.89.0.2:40868 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:40884 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:40892 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:40908 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:40922 - "POST /v1/embeddings HTTP/1.1" 200 OK
[openwebui] | 2025-10-24 11:41:07.725 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.127.1:36434 - "GET /_app/version.json HTTP/1.1" 200
[litellm]   | INFO:     10.89.0.2:40938 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:40940 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:40952 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:40956 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:40966 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:40970 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:40978 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:40984 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41000 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41012 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41018 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41034 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41036 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41046 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41050 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41062 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41072 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41078 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41094 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41110 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:41112 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56444 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56454 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56464 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56480 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56486 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56490 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56496 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56504 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56520 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56528 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56538 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56548 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56558 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56564 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56568 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56580 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56582 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56598 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56604 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56606 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56622 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56626 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56640 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56654 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56664 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56666 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56678 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:56684 - "POST /v1/embeddings HTTP/1.1" 200 OK
[openwebui] | 2025-10-24 11:41:22.460 | INFO     | open_webui.routers.retrieval:save_docs_to_vector_db:1452 - embeddings generated 54 for 54 items
[openwebui] | 2025-10-24 11:41:22.463 | INFO     | open_webui.routers.retrieval:save_docs_to_vector_db:1464 - adding to collection 8021f49c-01b9-4e32-b082-df23f08dedfd
[openwebui] | 2025-10-24 11:41:22.466 | ERROR    | open_webui.retrieval.vector.dbs.pgvector:insert:278 - Error during insert: can only concatenate str (not "list") to str
[openwebui] | Traceback (most recent call last):
[openwebui] |
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
[openwebui] |     self._bootstrap_inner()
[openwebui] |     │    └ <function Thread._bootstrap_inner at 0xffffa1d149a0>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
[openwebui] |     self.run()
[openwebui] |     │    └ <function WorkerThread.run at 0xffff33099940>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 976, in run
[openwebui] |     result = context.run(func, *args)
[openwebui] |              │       │   │      └ ()
[openwebui] |              │       │   └ functools.partial(<function add_file_to_knowledge_by_id at 0xffff603919e0>, user=UserModel(id='112cffcb-4ea1-4818-9fd1-91d309...
[openwebui] |              │       └ <method 'run' of '_contextvars.Context' objects>
[openwebui] |              └ <_contextvars.Context object at 0xffff3326a800>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/knowledge.py", line 398, in add_file_to_knowledge_by_id
[openwebui] |     process_file(
[openwebui] |     └ <function process_file at 0xffff604aca40>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1656, in process_file
[openwebui] |     result = save_docs_to_vector_db(
[openwebui] |              └ <function save_docs_to_vector_db at 0xffff6044bba0>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1465, in save_docs_to_vector_db
[openwebui] |     VECTOR_DB_CLIENT.insert(
[openwebui] |     │                └ <function PgvectorClient.insert at 0xffff650a8180>
[openwebui] |     └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] | > File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 262, in insert
[openwebui] |     vector = self.adjust_vector_length(item["vector"])
[openwebui] |              │    │                    └ {'id': 'eb95aa2f-2b85-42a6-b0a7-ca2b3845b7b1', 'text': '<redacted>...
[openwebui] |              │    └ <function PgvectorClient.adjust_vector_length at 0xffff650a80e0>
[openwebui] |              └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 220, in adjust_vector_length
[openwebui] |     vector += [0.0] * (VECTOR_LENGTH - current_length)
[openwebui] |     │                  │               └ 5
[openwebui] |     │                  └ 1536
[openwebui] |     └ 'float'
[openwebui] |
[openwebui] | TypeError: can only concatenate str (not "list") to str
[openwebui] | 2025-10-24 11:41:22.475 | ERROR    | open_webui.routers.retrieval:save_docs_to_vector_db:1473 - can only concatenate str (not "list") to str
[openwebui] | Traceback (most recent call last):
[openwebui] |
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
[openwebui] |     self._bootstrap_inner()
[openwebui] |     │    └ <function Thread._bootstrap_inner at 0xffffa1d149a0>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
[openwebui] |     self.run()
[openwebui] |     │    └ <function WorkerThread.run at 0xffff33099940>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 976, in run
[openwebui] |     result = context.run(func, *args)
[openwebui] |              │       │   │      └ ()
[openwebui] |              │       │   └ functools.partial(<function add_file_to_knowledge_by_id at 0xffff603919e0>, user=UserModel(id='112cffcb-4ea1-4818-9fd1-91d309...
[openwebui] |              │       └ <method 'run' of '_contextvars.Context' objects>
[openwebui] |              └ <_contextvars.Context object at 0xffff3326a800>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/knowledge.py", line 398, in add_file_to_knowledge_by_id
[openwebui] |     process_file(
[openwebui] |     └ <function process_file at 0xffff604aca40>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1656, in process_file
[openwebui] |     result = save_docs_to_vector_db(
[openwebui] |              └ <function save_docs_to_vector_db at 0xffff6044bba0>
[openwebui] |
[openwebui] | > File "/app/backend/open_webui/routers/retrieval.py", line 1465, in save_docs_to_vector_db
[openwebui] |     VECTOR_DB_CLIENT.insert(
[openwebui] |     │                └ <function PgvectorClient.insert at 0xffff650a8180>
[openwebui] |     └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 262, in insert
[openwebui] |     vector = self.adjust_vector_length(item["vector"])
[openwebui] |              │    │                    └ {'id': 'eb95aa2f-2b85-42a6-b0a7-ca2b3845b7b1', 'text': '<redacted>...
[openwebui] |              │    └ <function PgvectorClient.adjust_vector_length at 0xffff650a80e0>
[openwebui] |              └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 220, in adjust_vector_length
[openwebui] |     vector += [0.0] * (VECTOR_LENGTH - current_length)
[openwebui] |     │                  │               └ 5
[openwebui] |     │                  └ 1536
[openwebui] |     └ 'float'
[openwebui] |
[openwebui] | TypeError: can only concatenate str (not "list") to str
[openwebui] | 2025-10-24 11:41:22.475 | ERROR    | open_webui.routers.retrieval:process_file:1695 - can only concatenate str (not "list") to str
[openwebui] | Traceback (most recent call last):
[openwebui] |
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
[openwebui] |     self._bootstrap_inner()
[openwebui] |     │    └ <function Thread._bootstrap_inner at 0xffffa1d149a0>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
[openwebui] |     self.run()
[openwebui] |     │    └ <function WorkerThread.run at 0xffff33099940>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 976, in run
[openwebui] |     result = context.run(func, *args)
[openwebui] |              │       │   │      └ ()
[openwebui] |              │       │   └ functools.partial(<function add_file_to_knowledge_by_id at 0xffff603919e0>, user=UserModel(id='112cffcb-4ea1-4818-9fd1-91d309...
[openwebui] |              │       └ <method 'run' of '_contextvars.Context' objects>
[openwebui] |              └ <_contextvars.Context object at 0xffff3326a800>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/knowledge.py", line 398, in add_file_to_knowledge_by_id
[openwebui] |     process_file(
[openwebui] |     └ <function process_file at 0xffff604aca40>
[openwebui] |
[openwebui] | > File "/app/backend/open_webui/routers/retrieval.py", line 1692, in process_file
[openwebui] |     raise e
[openwebui] |           └ TypeError('can only concatenate str (not "list") to str')
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1656, in process_file
[openwebui] |     result = save_docs_to_vector_db(
[openwebui] |              └ <function save_docs_to_vector_db at 0xffff6044bba0>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1474, in save_docs_to_vector_db
[openwebui] |     raise e
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1465, in save_docs_to_vector_db
[openwebui] |     VECTOR_DB_CLIENT.insert(
[openwebui] |     │                └ <function PgvectorClient.insert at 0xffff650a8180>
[openwebui] |     └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 262, in insert
[openwebui] |     vector = self.adjust_vector_length(item["vector"])
[openwebui] |              │    │                    └ {'id': 'eb95aa2f-2b85-42a6-b0a7-ca2b3845b7b1', 'text': '<redacted>...
[openwebui] |              │    └ <function PgvectorClient.adjust_vector_length at 0xffff650a80e0>
[openwebui] |              └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 220, in adjust_vector_length
[openwebui] |     vector += [0.0] * (VECTOR_LENGTH - current_length)
[openwebui] |     │                  │               └ 5
[openwebui] |     │                  └ 1536
[openwebui] |     └ 'float'
[openwebui] |
[openwebui] | TypeError: can only concatenate str (not "list") to str
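
Note the traceback annotations: adjust_vector_length received the literal string 'float' as the vector, and current_length is 5, which is exactly len('float'). This suggests the embeddings response from LiteLLM/Cohere is nested under a type key (something like {"float": [...]}) and the parser is picking up the key instead of the values. A minimal sketch of defensive extraction, assuming that response shape (the key names here are guesses based on the traceback, not the actual Open WebUI code):

```python
def extract_embeddings(response: dict) -> list[list[float]]:
    """Pull per-item vectors out of an OpenAI-style embeddings response,
    unwrapping the type-keyed shape some Cohere models appear to return."""
    vectors: list[list[float]] = []
    for entry in response.get("data", []):
        vec = entry.get("embedding")
        # Assumed Cohere embed-v4 shape: the embedding is a dict keyed by
        # type, e.g. {"float": [...]}; unwrap it rather than iterating keys.
        if isinstance(vec, dict):
            vec = vec.get("float", [])
        if not isinstance(vec, list):
            raise TypeError(f"unexpected embedding payload: {type(vec).__name__}")
        vectors.append(vec)
    return vectors
```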

IndexError (using RAG_EMBEDDING_OPENAI_BATCH_SIZE > 1):

[openwebui] | 2025-10-24 15:23:34.532 | INFO     | open_webui.routers.retrieval:save_docs_to_vector_db:1300 - save_docs_to_vector_db: document <redacted>.pdf file-a799a550-0adb-4691-9575-6ccad6d286fa
[openwebui] | 2025-10-24 15:23:34.805 | INFO     | open_webui.routers.retrieval:save_docs_to_vector_db:1416 - generating embeddings for file-a799a550-0adb-4691-9575-6ccad6d286fa
[litellm]   | INFO:     127.0.0.1:37126 - "GET /health/liveliness HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58760 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58768 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58782 - "POST /v1/embeddings HTTP/1.1" 200 OK
[openwebui] | 2025-10-24 15:23:37.011 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.127.1:26177 - "GET /_app/version.json HTTP/1.1" 200
[litellm]   | INFO:     10.89.0.2:58792 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58794 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58802 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58814 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58822 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58824 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58838 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58850 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58854 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58858 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58868 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:58884 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:49970 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:49984 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:49988 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:49996 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:50010 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:50024 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:50028 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:50034 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:50046 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm]   | INFO:     10.89.0.2:50048 - "POST /v1/embeddings HTTP/1.1" 200 OK
[openwebui] | 2025-10-24 15:23:47.046 | INFO     | open_webui.routers.retrieval:save_docs_to_vector_db:1452 - embeddings generated 25 for 73 items
[openwebui] | 2025-10-24 15:23:47.049 | ERROR    | open_webui.routers.retrieval:save_docs_to_vector_db:1473 - list index out of range
[openwebui] | Traceback (most recent call last):
[openwebui] |
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
[openwebui] |     self._bootstrap_inner()
[openwebui] |     │    └ <function Thread._bootstrap_inner at 0xffffa1d149a0>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469507457440)>
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
[openwebui] |     self.run()
[openwebui] |     │    └ <function WorkerThread.run at 0xffff33099940>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469507457440)>
[openwebui] |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 976, in run
[openwebui] |     result = context.run(func, *args)
[openwebui] |              │       │   │      └ ()
[openwebui] |              │       │   └ functools.partial(<function process_uploaded_file at 0xffff64f4df80>, <starlette.requests.Request object at 0xffff32d7d590>, ...
[openwebui] |              │       └ <method 'run' of '_contextvars.Context' objects>
[openwebui] |              └ <_contextvars.Context object at 0xfffef207fe80>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/files.py", line 117, in process_uploaded_file
[openwebui] |     process_file(request, ProcessFileForm(file_id=file_item.id), user=user)
[openwebui] |     │            │        │                       │         │         └ UserModel(id='112cffcb-4ea1-4818-9fd1-91d309b17d34',...
[openwebui] |     │            │        │                       │         └ 'a799a550-0adb-4691-9575-6ccad6d286fa'
[openwebui] |     │            │        │                       └ FileModel(id='a799a550-0adb-4691-9575-6ccad6d286fa', user_id='112cffcb-4ea1-4818-9fd1-91d309b17d34', hash=None, filename='FRS...
[openwebui] |     │            │        └ <class 'open_webui.routers.retrieval.ProcessFileForm'>
[openwebui] |     │            └ <starlette.requests.Request object at 0xffff32d7d590>
[openwebui] |     └ <function process_file at 0xffff604aca40>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1656, in process_file
[openwebui] |     result = save_docs_to_vector_db(
[openwebui] |              └ <function save_docs_to_vector_db at 0xffff6044bba0>
[openwebui] |
[openwebui] | > File "/app/backend/open_webui/routers/retrieval.py", line 1454, in save_docs_to_vector_db
[openwebui] |     items = [
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1458, in <listcomp>
[openwebui] |     "vector": embeddings[idx],
[openwebui] |               │          └ 25
[openwebui] |               └ ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float'...
[openwebui] |
[openwebui] | IndexError: list index out of range
[openwebui] | 2025-10-24 15:23:47.058 | ERROR    | open_webui.routers.retrieval:process_file:1695 - list index out of range
[openwebui] | Traceback (most recent call last):
[openwebui] |
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
[openwebui] |     self._bootstrap_inner()
[openwebui] |     │    └ <function Thread._bootstrap_inner at 0xffffa1d149a0>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469507457440)>
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
[openwebui] |     self.run()
[openwebui] |     │    └ <function WorkerThread.run at 0xffff33099940>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469507457440)>
[openwebui] |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 976, in run
[openwebui] |     result = context.run(func, *args)
[openwebui] |              │       │   │      └ ()
[openwebui] |              │       │   └ functools.partial(<function process_uploaded_file at 0xffff64f4df80>, <starlette.requests.Request object at 0xffff32d7d590>, ...
[openwebui] |              │       └ <method 'run' of '_contextvars.Context' objects>
[openwebui] |              └ <_contextvars.Context object at 0xfffef207fe80>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/files.py", line 117, in process_uploaded_file
[openwebui] |     process_file(request, ProcessFileForm(file_id=file_item.id), user=user)
[openwebui] |     │            │        │                       │         │         └ UserModel(id='112cffcb-4ea1-4818-9fd1-91d309b17d34',...
[openwebui] |     │            │        │                       │         └ 'a799a550-0adb-4691-9575-6ccad6d286fa'
[openwebui] |     │            │        │                       └ FileModel(id='a799a550-0adb-4691-9575-6ccad6d286fa', user_id='112cffcb-4ea1-4818-9fd1-91d309b17d34', hash=None, filename='FRS...
[openwebui] |     │            │        └ <class 'open_webui.routers.retrieval.ProcessFileForm'>
[openwebui] |     │            └ <starlette.requests.Request object at 0xffff32d7d590>
[openwebui] |     └ <function process_file at 0xffff604aca40>
[openwebui] |
[openwebui] | > File "/app/backend/open_webui/routers/retrieval.py", line 1692, in process_file
[openwebui] |     raise e
[openwebui] |           └ IndexError('list index out of range')
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1656, in process_file
[openwebui] |     result = save_docs_to_vector_db(
[openwebui] |              └ <function save_docs_to_vector_db at 0xffff6044bba0>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1474, in save_docs_to_vector_db
[openwebui] |     raise e
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1454, in save_docs_to_vector_db
[openwebui] |     items = [
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1458, in <listcomp>
[openwebui] |     "vector": embeddings[idx],
[openwebui] |               │          └ 25
[openwebui] |               └ ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float'...
[openwebui] |
[openwebui] | IndexError: list index out of range
[openwebui] | 2025-10-24 15:23:47.209 | ERROR    | open_webui.routers.files:process_uploaded_file:124 - Error processing file: a799a550-0adb-4691-9575-6ccad6d286fa
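
The log line "embeddings generated 25 for 73 items" is consistent with one result per batch rather than one per item: with RAG_EMBEDDING_OPENAI_BATCH_SIZE=3, 73 items produce ceil(73/3) = 25 batches. A minimal sketch of the aligned batching I would expect (function names are hypothetical, not the actual Open WebUI internals):

```python
from typing import Callable

def embed_all(
    texts: list[str],
    embed_batch: Callable[[list[str]], list[list[float]]],
    batch_size: int = 3,
) -> list[list[float]]:
    """Embed texts in batches while keeping the output aligned 1:1 with
    the inputs; extend (not append) per batch so len(result) == len(texts)."""
    embeddings: list[list[float]] = []
    for i in range(0, len(texts), batch_size):
        batch = texts[i : i + batch_size]
        embeddings.extend(embed_batch(batch))
    return embeddings
```

If instead each batch's result were appended as a single element, 73 inputs would yield 25 entries and indexing embeddings[idx] for idx >= 25 would raise exactly the IndexError shown above.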

Additional Information

  • Within our setup, we're using Models hosted on AWS Bedrock and consume them via LiteLLM.
Originally created by @ecktom on GitHub (Oct 24, 2025). Original GitHub issue: https://github.com/open-webui/open-webui/issues/18592 ### Check Existing Issues - [x] I have searched for any existing and/or related issues. - [x] I have searched for any existing and/or related discussions. - [x] I am using the latest version of Open WebUI. ### Installation Method Docker ### Open WebUI Version v0.6.34 ### Ollama Version (if applicable) _No response_ ### Operating System Docker ### Browser (if applicable) _No response_ ### Confirmation - [x] I have read and followed all instructions in `README.md`. - [x] I am using the latest version of **both** Open WebUI and Ollama. - [x] I have included the browser console logs. - [x] I have included the Docker container logs. - [x] I have **provided every relevant configuration, setting, and environment variable used in my setup.** - [x] I have clearly **listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup** (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc). - [x] I have documented **step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation**. My steps: - Start with the initial platform/version/OS and dependencies used, - Specify exact install/launch/configure commands, - List URLs visited, user input (incl. example values/emails/passwords if needed), - Describe all options and toggles enabled or changed, - Include any files or environmental changes, - Identify the expected and actual result at each stage, - Ensure any reasonably skilled user can follow and hit the same issue. ### Expected Behavior After uploading a PDF to some knowledge base, embeddings should be stored correctly ### Actual Behavior A `TypeError` is raised within` /app/backend/open_webui/retrieval/vector/dbs/pgvector.py`, the PDF is not getting stored. 
If setting `RAG_EMBEDDING_OPENAI_BATCH_SIZE` to eg. `3` an `IndexError` appears. ### Steps to Reproduce 1) Set up `pgvector` as vector database 2) Set up `cohere-embed-v4` as OpenAI based engine for embeddings 3) Try uploading a document to some knowledge base ### Logs & Screenshots TypeError: ``` [openwebui] | 2025-10-24 11:41:05.845 | INFO | open_webui.routers.retrieval:save_docs_to_vector_db:1300 - save_docs_to_vector_db: document <redacted>.pdf 8021f49c-01b9-4e32-b082-df23f08dedfd [openwebui] | 2025-10-24 11:41:05.911 | INFO | open_webui.routers.retrieval:save_docs_to_vector_db:1416 - generating embeddings for 8021f49c-01b9-4e32-b082-df23f08dedfd [litellm] | INFO: 10.89.0.2:40868 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:40884 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:40892 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:40908 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:40922 - "POST /v1/embeddings HTTP/1.1" 200 OK [openwebui] | 2025-10-24 11:41:07.725 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.127.1:36434 - "GET /_app/version.json HTTP/1.1" 200 [litellm] | INFO: 10.89.0.2:40938 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:40940 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:40952 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:40956 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:40966 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:40970 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:40978 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:40984 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41000 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41012 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41018 - "POST /v1/embeddings HTTP/1.1" 200 OK 
[litellm] | INFO: 10.89.0.2:41034 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41036 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41046 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41050 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41062 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41072 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41078 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41094 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41110 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:41112 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56444 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56454 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56464 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56480 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56486 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56490 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56496 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56504 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56520 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56528 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56538 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56548 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56558 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56564 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56568 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56580 - "POST /v1/embeddings HTTP/1.1" 200 OK [litellm] | INFO: 10.89.0.2:56582 - "POST /v1/embeddings HTTP/1.1" 200 OK 
[litellm] | INFO: 10.89.0.2:56598 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:56604 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:56606 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:56622 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:56626 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:56640 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:56654 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:56664 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:56666 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:56678 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:56684 - "POST /v1/embeddings HTTP/1.1" 200 OK
[openwebui] | 2025-10-24 11:41:22.460 | INFO | open_webui.routers.retrieval:save_docs_to_vector_db:1452 - embeddings generated 54 for 54 items
[openwebui] | 2025-10-24 11:41:22.463 | INFO | open_webui.routers.retrieval:save_docs_to_vector_db:1464 - adding to collection 8021f49c-01b9-4e32-b082-df23f08dedfd
[openwebui] | 2025-10-24 11:41:22.466 | ERROR | open_webui.retrieval.vector.dbs.pgvector:insert:278 - Error during insert: can only concatenate str (not "list") to str
[openwebui] | Traceback (most recent call last):
[openwebui] |
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
[openwebui] |     self._bootstrap_inner()
[openwebui] |     │    └ <function Thread._bootstrap_inner at 0xffffa1d149a0>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
[openwebui] |     self.run()
[openwebui] |     │    └ <function WorkerThread.run at 0xffff33099940>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 976, in run
[openwebui] |     result = context.run(func, *args)
[openwebui] |              │       │   │     └ ()
[openwebui] |              │       │   └ functools.partial(<function add_file_to_knowledge_by_id at 0xffff603919e0>, user=UserModel(id='112cffcb-4ea1-4818-9fd1-91d309...
[openwebui] |              │       └ <method 'run' of '_contextvars.Context' objects>
[openwebui] |              └ <_contextvars.Context object at 0xffff3326a800>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/knowledge.py", line 398, in add_file_to_knowledge_by_id
[openwebui] |     process_file(
[openwebui] |     └ <function process_file at 0xffff604aca40>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1656, in process_file
[openwebui] |     result = save_docs_to_vector_db(
[openwebui] |              └ <function save_docs_to_vector_db at 0xffff6044bba0>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1465, in save_docs_to_vector_db
[openwebui] |     VECTOR_DB_CLIENT.insert(
[openwebui] |     │                └ <function PgvectorClient.insert at 0xffff650a8180>
[openwebui] |     └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] | > File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 262, in insert
[openwebui] |     vector = self.adjust_vector_length(item["vector"])
[openwebui] |              │    │                     └ {'id': 'eb95aa2f-2b85-42a6-b0a7-ca2b3845b7b1', 'text': '<redacted>...
[openwebui] |              │    └ <function PgvectorClient.adjust_vector_length at 0xffff650a80e0>
[openwebui] |              └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 220, in adjust_vector_length
[openwebui] |     vector += [0.0] * (VECTOR_LENGTH - current_length)
[openwebui] |     │                  │               └ 5
[openwebui] |     │                  └ 1536
[openwebui] |     └ 'float'
[openwebui] |
[openwebui] | TypeError: can only concatenate str (not "list") to str
[openwebui] | 2025-10-24 11:41:22.475 | ERROR | open_webui.routers.retrieval:save_docs_to_vector_db:1473 - can only concatenate str (not "list") to str
[openwebui] | Traceback (most recent call last):
[openwebui] |
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
[openwebui] |     self._bootstrap_inner()
[openwebui] |     │    └ <function Thread._bootstrap_inner at 0xffffa1d149a0>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
[openwebui] |     self.run()
[openwebui] |     │    └ <function WorkerThread.run at 0xffff33099940>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 976, in run
[openwebui] |     result = context.run(func, *args)
[openwebui] |              │       │   │     └ ()
[openwebui] |              │       │   └ functools.partial(<function add_file_to_knowledge_by_id at 0xffff603919e0>, user=UserModel(id='112cffcb-4ea1-4818-9fd1-91d309...
[openwebui] |              │       └ <method 'run' of '_contextvars.Context' objects>
[openwebui] |              └ <_contextvars.Context object at 0xffff3326a800>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/knowledge.py", line 398, in add_file_to_knowledge_by_id
[openwebui] |     process_file(
[openwebui] |     └ <function process_file at 0xffff604aca40>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1656, in process_file
[openwebui] |     result = save_docs_to_vector_db(
[openwebui] |              └ <function save_docs_to_vector_db at 0xffff6044bba0>
[openwebui] |
[openwebui] | > File "/app/backend/open_webui/routers/retrieval.py", line 1465, in save_docs_to_vector_db
[openwebui] |     VECTOR_DB_CLIENT.insert(
[openwebui] |     │                └ <function PgvectorClient.insert at 0xffff650a8180>
[openwebui] |     └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 262, in insert
[openwebui] |     vector = self.adjust_vector_length(item["vector"])
[openwebui] |              │    │                     └ {'id': 'eb95aa2f-2b85-42a6-b0a7-ca2b3845b7b1', 'text': '<redacted>...
[openwebui] |              │    └ <function PgvectorClient.adjust_vector_length at 0xffff650a80e0>
[openwebui] |              └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 220, in adjust_vector_length
[openwebui] |     vector += [0.0] * (VECTOR_LENGTH - current_length)
[openwebui] |     │                  │               └ 5
[openwebui] |     │                  └ 1536
[openwebui] |     └ 'float'
[openwebui] |
[openwebui] | TypeError: can only concatenate str (not "list") to str
[openwebui] | 2025-10-24 11:41:22.475 | ERROR | open_webui.routers.retrieval:process_file:1695 - can only concatenate str (not "list") to str
[openwebui] | Traceback (most recent call last):
[openwebui] |
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
[openwebui] |     self._bootstrap_inner()
[openwebui] |     │    └ <function Thread._bootstrap_inner at 0xffffa1d149a0>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
[openwebui] |     self.run()
[openwebui] |     │    └ <function WorkerThread.run at 0xffff33099940>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469515911584)>
[openwebui] |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 976, in run
[openwebui] |     result = context.run(func, *args)
[openwebui] |              │       │   │     └ ()
[openwebui] |              │       │   └ functools.partial(<function add_file_to_knowledge_by_id at 0xffff603919e0>, user=UserModel(id='112cffcb-4ea1-4818-9fd1-91d309...
[openwebui] |              │       └ <method 'run' of '_contextvars.Context' objects>
[openwebui] |              └ <_contextvars.Context object at 0xffff3326a800>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/knowledge.py", line 398, in add_file_to_knowledge_by_id
[openwebui] |     process_file(
[openwebui] |     └ <function process_file at 0xffff604aca40>
[openwebui] |
[openwebui] | > File "/app/backend/open_webui/routers/retrieval.py", line 1692, in process_file
[openwebui] |     raise e
[openwebui] |           └ TypeError('can only concatenate str (not "list") to str')
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1656, in process_file
[openwebui] |     result = save_docs_to_vector_db(
[openwebui] |              └ <function save_docs_to_vector_db at 0xffff6044bba0>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1474, in save_docs_to_vector_db
[openwebui] |     raise e
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1465, in save_docs_to_vector_db
[openwebui] |     VECTOR_DB_CLIENT.insert(
[openwebui] |     │                └ <function PgvectorClient.insert at 0xffff650a8180>
[openwebui] |     └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 262, in insert
[openwebui] |     vector = self.adjust_vector_length(item["vector"])
[openwebui] |              │    │                     └ {'id': 'eb95aa2f-2b85-42a6-b0a7-ca2b3845b7b1', 'text': '<redacted>...
[openwebui] |              │    └ <function PgvectorClient.adjust_vector_length at 0xffff650a80e0>
[openwebui] |              └ <open_webui.retrieval.vector.dbs.pgvector.PgvectorClient object at 0xffff6529a110>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/retrieval/vector/dbs/pgvector.py", line 220, in adjust_vector_length
[openwebui] |     vector += [0.0] * (VECTOR_LENGTH - current_length)
[openwebui] |     │                  │               └ 5
[openwebui] |     │                  └ 1536
[openwebui] |     └ 'float'
[openwebui] |
[openwebui] | TypeError: can only concatenate str (not "list") to str
```

IndexError (using `RAG_EMBEDDING_OPENAI_BATCH_SIZE` > 1):

```
[openwebui] | 2025-10-24 15:23:34.532 | INFO | open_webui.routers.retrieval:save_docs_to_vector_db:1300 - save_docs_to_vector_db: document <redacted>.pdf file-a799a550-0adb-4691-9575-6ccad6d286fa
[openwebui] | 2025-10-24 15:23:34.805 | INFO | open_webui.routers.retrieval:save_docs_to_vector_db:1416 - generating embeddings for file-a799a550-0adb-4691-9575-6ccad6d286fa
[litellm] | INFO: 127.0.0.1:37126 - "GET /health/liveliness HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58760 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58768 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58782 - "POST /v1/embeddings HTTP/1.1" 200 OK
[openwebui] | 2025-10-24 15:23:37.011 | INFO | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.127.1:26177 - "GET /_app/version.json HTTP/1.1" 200
[litellm] | INFO: 10.89.0.2:58792 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58794 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58802 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58814 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58822 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58824 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58838 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58850 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58854 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58858 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58868 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:58884 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:49970 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:49984 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:49988 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:49996 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:50010 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:50024 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:50028 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:50034 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:50046 - "POST /v1/embeddings HTTP/1.1" 200 OK
[litellm] | INFO: 10.89.0.2:50048 - "POST /v1/embeddings HTTP/1.1" 200 OK
[openwebui] | 2025-10-24 15:23:47.046 | INFO | open_webui.routers.retrieval:save_docs_to_vector_db:1452 - embeddings generated 25 for 73 items
[openwebui] | 2025-10-24 15:23:47.049 | ERROR | open_webui.routers.retrieval:save_docs_to_vector_db:1473 - list index out of range
[openwebui] | Traceback (most recent call last):
[openwebui] |
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
[openwebui] |     self._bootstrap_inner()
[openwebui] |     │    └ <function Thread._bootstrap_inner at 0xffffa1d149a0>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469507457440)>
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
[openwebui] |     self.run()
[openwebui] |     │    └ <function WorkerThread.run at 0xffff33099940>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469507457440)>
[openwebui] |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 976, in run
[openwebui] |     result = context.run(func, *args)
[openwebui] |              │       │   │     └ ()
[openwebui] |              │       │   └ functools.partial(<function process_uploaded_file at 0xffff64f4df80>, <starlette.requests.Request object at 0xffff32d7d590>, ...
[openwebui] |              │       └ <method 'run' of '_contextvars.Context' objects>
[openwebui] |              └ <_contextvars.Context object at 0xfffef207fe80>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/files.py", line 117, in process_uploaded_file
[openwebui] |     process_file(request, ProcessFileForm(file_id=file_item.id), user=user)
[openwebui] |     │            │        │               │       │             └ UserModel(id='112cffcb-4ea1-4818-9fd1-91d309b17d34',...
[openwebui] |     │            │        │               │       └ 'a799a550-0adb-4691-9575-6ccad6d286fa'
[openwebui] |     │            │        │               └ FileModel(id='a799a550-0adb-4691-9575-6ccad6d286fa', user_id='112cffcb-4ea1-4818-9fd1-91d309b17d34', hash=None, filename='FRS...
[openwebui] |     │            │        └ <class 'open_webui.routers.retrieval.ProcessFileForm'>
[openwebui] |     │            └ <starlette.requests.Request object at 0xffff32d7d590>
[openwebui] |     └ <function process_file at 0xffff604aca40>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1656, in process_file
[openwebui] |     result = save_docs_to_vector_db(
[openwebui] |              └ <function save_docs_to_vector_db at 0xffff6044bba0>
[openwebui] |
[openwebui] | > File "/app/backend/open_webui/routers/retrieval.py", line 1454, in save_docs_to_vector_db
[openwebui] |     items = [
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1458, in <listcomp>
[openwebui] |     "vector": embeddings[idx],
[openwebui] |               │          └ 25
[openwebui] |               └ ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float'...
[openwebui] |
[openwebui] | IndexError: list index out of range
[openwebui] | 2025-10-24 15:23:47.058 | ERROR | open_webui.routers.retrieval:process_file:1695 - list index out of range
[openwebui] | Traceback (most recent call last):
[openwebui] |
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1002, in _bootstrap
[openwebui] |     self._bootstrap_inner()
[openwebui] |     │    └ <function Thread._bootstrap_inner at 0xffffa1d149a0>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469507457440)>
[openwebui] |   File "/usr/local/lib/python3.11/threading.py", line 1045, in _bootstrap_inner
[openwebui] |     self.run()
[openwebui] |     │    └ <function WorkerThread.run at 0xffff33099940>
[openwebui] |     └ <WorkerThread(AnyIO worker thread, started 281469507457440)>
[openwebui] |   File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 976, in run
[openwebui] |     result = context.run(func, *args)
[openwebui] |              │       │   │     └ ()
[openwebui] |              │       │   └ functools.partial(<function process_uploaded_file at 0xffff64f4df80>, <starlette.requests.Request object at 0xffff32d7d590>, ...
[openwebui] |              │       └ <method 'run' of '_contextvars.Context' objects>
[openwebui] |              └ <_contextvars.Context object at 0xfffef207fe80>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/files.py", line 117, in process_uploaded_file
[openwebui] |     process_file(request, ProcessFileForm(file_id=file_item.id), user=user)
[openwebui] |     │            │        │               │       │             └ UserModel(id='112cffcb-4ea1-4818-9fd1-91d309b17d34',...
[openwebui] |     │            │        │               │       └ 'a799a550-0adb-4691-9575-6ccad6d286fa'
[openwebui] |     │            │        │               └ FileModel(id='a799a550-0adb-4691-9575-6ccad6d286fa', user_id='112cffcb-4ea1-4818-9fd1-91d309b17d34', hash=None, filename='FRS...
[openwebui] |     │            │        └ <class 'open_webui.routers.retrieval.ProcessFileForm'>
[openwebui] |     │            └ <starlette.requests.Request object at 0xffff32d7d590>
[openwebui] |     └ <function process_file at 0xffff604aca40>
[openwebui] |
[openwebui] | > File "/app/backend/open_webui/routers/retrieval.py", line 1692, in process_file
[openwebui] |     raise e
[openwebui] |           └ IndexError('list index out of range')
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1656, in process_file
[openwebui] |     result = save_docs_to_vector_db(
[openwebui] |              └ <function save_docs_to_vector_db at 0xffff6044bba0>
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1474, in save_docs_to_vector_db
[openwebui] |     raise e
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1454, in save_docs_to_vector_db
[openwebui] |     items = [
[openwebui] |
[openwebui] |   File "/app/backend/open_webui/routers/retrieval.py", line 1458, in <listcomp>
[openwebui] |     "vector": embeddings[idx],
[openwebui] |               │          └ 25
[openwebui] |               └ ['float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float', 'float'...
[openwebui] |
[openwebui] | IndexError: list index out of range
[openwebui] | 2025-10-24 15:23:47.209 | ERROR | open_webui.routers.files:process_uploaded_file:124 - Error processing file: a799a550-0adb-4691-9575-6ccad6d286fa
```

### Additional Information

- Within our setup, we're using models hosted on AWS Bedrock and consume them via LiteLLM.
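Reading the annotated values in the tracebacks suggests two related symptoms (this is my interpretation, not a confirmed diagnosis): in the TypeError case, `└ 'float'` shows `item["vector"]` arrived as the literal string `'float'` rather than a list of floats, so the list-padding `+=` in `adjust_vector_length` becomes string concatenation; in the IndexError case, `embeddings generated 25 for 73 items` with batch size 3 matches `ceil(73 / 3) = 25`, i.e. one embeddings entry per *batch* instead of per *chunk*, so `embeddings[idx]` overruns at `idx >= 25`. A minimal sketch illustrating both conditions (function names, `VECTOR_LENGTH`, and the guard logic are hypothetical and do not reflect Open WebUI's actual implementation):

```python
import math

VECTOR_LENGTH = 1536  # assumed target dimension, taken from the log annotation `└ 1536`

def adjust_vector_length(vector):
    # Hypothetical guard: the traceback shows this function receiving the
    # string 'float', where `vector += [0.0] * n` str-concatenates and fails.
    if not isinstance(vector, list) or not all(isinstance(v, (int, float)) for v in vector):
        raise TypeError(f"expected a list of floats, got {type(vector).__name__!r}")
    if len(vector) < VECTOR_LENGTH:
        vector = vector + [0.0] * (VECTOR_LENGTH - len(vector))
    return vector

def flatten_batches(batched):
    # If each batched /v1/embeddings response is appended as a single element,
    # 73 chunks at batch size 3 yield len(embeddings) == 25, not 73.
    # Flattening restores the one-vector-per-chunk invariant.
    return [vec for batch in batched for vec in batch]

# 24 full batches of 3 plus one batch of 1 -> 25 batches covering 73 chunks
batches = [[[0.1] * 8] * 3 for _ in range(24)] + [[[0.1] * 8]]
embeddings = flatten_batches(batches)
assert len(batches) == math.ceil(73 / 3) == 25
assert len(embeddings) == 73
```

The sketch is only meant to show why both errors can share one root cause: if the embedding response for this provider is parsed into the wrong shape (or into field *names* like `'float'` instead of values), every downstream consumer sees either wrong types or wrong counts.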
GiteaMirror added the bug label 2026-05-05 20:50:31 -05:00
@tjbck commented on GitHub (Oct 26, 2025):

Did you change your embedding model?

Reference: github-starred/open-webui#57312