[GH-ISSUE #436] Ollama embeddings LangChain integration #46715

Closed
opened 2026-04-27 23:39:48 -05:00 by GiteaMirror · 6 comments

Originally created by @jmorganca on GitHub (Aug 28, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/436

It would be great combo to be able to use Ollama as both a model and embeddings back end (i.e. langchain.embeddings.OllamaEmbeddings) together.
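A sketch of what the requested combination might look like, following the `langchain.embeddings.OllamaEmbeddings` naming above (the import paths and `model` value are assumptions, and a local Ollama server on the default port is required):

```python
from langchain.llms import Ollama
from langchain.embeddings import OllamaEmbeddings

# Both back ends would talk to the same local Ollama server.
llm = Ollama(model="llama2")
embeddings = OllamaEmbeddings(model="llama2")

# Embed a query, then answer it, using one model server for both.
vector = embeddings.embed_query("What is Ollama?")  # list of floats
answer = llm("What is Ollama?")                     # generated text
```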

GiteaMirror added the feature request label 2026-04-27 23:39:48 -05:00

@yackermann commented on GitHub (Sep 2, 2023):

@jmorganca I've done it: https://github.com/langchain-ai/langchain/pull/10124


@yackermann commented on GitHub (Sep 6, 2023):

Ref: #472


@jmorganca commented on GitHub (Sep 22, 2023):

This is [completed](https://python.langchain.com/docs/integrations/llms/ollama) 🎉


@Sravani-ytp commented on GitHub (Dec 4, 2023):

@jmorganca, I am unable to get a response from `OllamaEmbeddings()`.


@yackermann commented on GitHub (Dec 4, 2023):

What's the error?



@Sravani-ytp commented on GitHub (Dec 4, 2023):

```python
loader = TextLoader("a.txt")
text_splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=10)

# docs = text_splitter.split_documents(documents)

web_docs, meta = [], []
splits1 = text_splitter.split_text(str(documents))
web_docs.extend(splits1)
embeddings = OllamaEmbeddings()
query_result = embeddings.embed_query(splits1)
print(query_result[:5])
```

Here, nothing is printing.

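One likely cause: `splits1` is a list of chunks, but `embed_query` expects a single string; lists of texts go to `embed_documents`. A stub illustrating the expected interface shapes (the `StubEmbeddings` class is hypothetical and returns dummy vectors; the real `OllamaEmbeddings` calls a running Ollama server):

```python
from typing import List


class StubEmbeddings:
    """Stand-in for OllamaEmbeddings, showing the interface contract only."""

    def embed_query(self, text: str) -> List[float]:
        # One string in, one vector out (dummy 4-dim vector here).
        return [float(len(text))] * 4

    def embed_documents(self, texts: List[str]) -> List[List[float]]:
        # A list of strings in, one vector per string out.
        return [self.embed_query(t) for t in texts]


emb = StubEmbeddings()
chunks = ["first chunk", "second chunk"]
doc_vectors = emb.embed_documents(chunks)    # one vector per chunk
query_vector = emb.embed_query("a question") # a single vector
```

With the real class, embedding each chunk via `embed_documents(splits1)` and the question via `embed_query("...")` should print actual values.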
Reference: github-starred/ollama#46715