[GH-ISSUE #10176] Switched to nomic-embed-text model but still get 8192 dimension #6676

Closed
opened 2026-04-12 18:24:13 -05:00 by GiteaMirror · 3 comments

Originally created by @khteh on GitHub (Apr 8, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10176

What is the issue?

https://github.com/ollama/ollama/issues/10149

OllamaEmbeddings(model=config.EMBEDDING_MODEL, base_url=config.OLLAMA_URI, num_ctx=8192, num_gpu=1, temperature=0)

Relevant log output

ids = await self.vector_store.aadd_documents(documents = unique_docs, ids = unique_ids)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/langchain_core/vectorstores/base.py", line 323, in aadd_documents
    return await run_in_executor(None, self.add_documents, documents, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/langchain_core/runnables/config.py", line 616, in run_in_executor
    return await asyncio.get_running_loop().run_in_executor(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/langchain_core/runnables/config.py", line 607, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/langchain_core/vectorstores/base.py", line 287, in add_documents
    return self.add_texts(texts, metadatas, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/langchain_chroma/vectorstores.py", line 556, in add_texts
    self._collection.upsert(
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/chromadb/api/models/Collection.py", line 344, in upsert
    self._client._upsert(
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/chromadb/telemetry/opentelemetry/__init__.py", line 150, in wrapper
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/chromadb/api/fastapi.py", line 537, in _upsert
    self._submit_batch(
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/chromadb/telemetry/opentelemetry/__init__.py", line 150, in wrapper
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/chromadb/api/fastapi.py", line 436, in _submit_batch
    self._make_request(
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/chromadb/api/fastapi.py", line 90, in _make_request
    BaseHTTPClient._raise_chroma_error(response)
  File "/home/khteh/.local/share/virtualenvs/rag-agent-YeW3dxEa/lib/python3.12/site-packages/chromadb/api/base_http_client.py", line 96, in _raise_chroma_error
    raise chroma_error
chromadb.errors.InvalidArgumentError: Collection expecting embedding with dimension of 8192, got 768
root@ollama-0:/# ollama --version
ollama version is 0.6.2
root@ollama-0:/# ollama show nomic-embed-text
  Model
    architecture        nomic-bert    
    parameters          136.73M       
    context length      2048          
    embedding length    768           
    quantization        F16           

  Parameters
    num_ctx    8192    

  License
    Apache License               
    Version 2.0, January 2004
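Note the `ollama show` output above: `nomic-embed-text` has an embedding length of 768, and `num_ctx` only affects the context window, not the vector width. The error comes from Chroma, whose collections have a fixed dimensionality set when they are first populated. A minimal sketch of that check (the function name and values are illustrative, not Chroma's actual code):

```python
# Sketch of the validation Chroma performs on upsert: a collection's
# dimensionality is fixed by the first embeddings written to it, and
# every later write must match.

def check_dimension(expected: int, embedding: list) -> None:
    if len(embedding) != expected:
        raise ValueError(
            f"Collection expecting embedding with dimension of {expected}, "
            f"got {len(embedding)}"
        )

stored = [0.0] * 8192    # dimension the collection was originally created with
incoming = [0.0] * 768   # nomic-embed-text output is 768-dimensional

try:
    check_dimension(len(stored), incoming)
except ValueError as e:
    print(e)  # mirrors the InvalidArgumentError in the log above
```

So switching embedding models does change the vectors to 768 dimensions; the failure is the old 8192-dimension collection rejecting them.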

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.6.2

GiteaMirror added the bug label 2026-04-12 18:24:13 -05:00

@rossbg commented on GitHub (Apr 8, 2025):

As per the error message, you need to adjust your vector field dimension size in Chroma DB.

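In practice, "adjusting the dimension" means recreating the collection, since Chroma fixes a collection's dimensionality at creation time. A hedged sketch using the `chromadb` Python client; the host, port, and collection name are illustrative, not taken from the reporter's setup:

```python
def reset_collection(host: str, port: int, name: str):
    """Drop and recreate a Chroma collection so its embedding dimension is
    set by the next batch of vectors written to it. Needed after switching
    to an embedding model with a different output width."""
    import chromadb  # assumes the chromadb client package is installed

    client = chromadb.HttpClient(host=host, port=port)
    try:
        client.delete_collection(name=name)
    except Exception:
        pass  # collection may not exist yet
    return client.create_collection(name=name)

# Usage (requires a running Chroma server; names are hypothetical):
# collection = reset_collection("chroma", 8000, "rag-documents")
```

Any documents already in the old collection must be re-embedded and re-ingested afterwards, since the stored 8192-dimension vectors are discarded with it.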

@khteh commented on GitHub (Apr 8, 2025):

Any idea how to do that?


@khteh commented on GitHub (Apr 8, 2025):

My mistake. I restarted the k8s chroma STS but I missed the line which deleted the PVC.


Reference: github-starred/ollama#6676