Mirror of https://github.com/open-webui/open-webui.git, synced 2026-03-11 00:04:08 -05:00
OllamaLLM fails to connect to custom Ollama server after migration from langchain_community #1928
Originally created by @fmdelgado on GitHub (Aug 29, 2024).
Description
I'm experiencing connection issues when trying to use `OllamaLLM` from `langchain_ollama` to connect to a custom Ollama server. This setup previously worked correctly when using `Ollama` from `langchain_community.llms`.

Current Behavior
When attempting to call `ollama_llm.invoke()`, I receive a `ConnectError: [Errno 61] Connection refused` error.

Expected Behavior

The `OllamaLLM` class should successfully connect to the custom Ollama server and allow me to generate responses using the `invoke()` method, as it did when using `langchain_community.llms.Ollama`.

Steps to Reproduce
1. Instantiate `OllamaLLM` with the appropriate parameters.
2. Call `ollama_llm.invoke()` to generate a response.

Code Sample
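The original code sample did not survive in this mirror of the issue. As a stand-in (not the reporter's code; the base URL is a placeholder), here is a minimal stdlib-only sketch that separates the two situations the report hinges on: an Ollama server that actually answers on its REST API versus the `Connection refused` failure described above.

```python
import json
import urllib.error
import urllib.request

# Placeholder -- substitute the address of your custom Ollama server.
BASE_URL = "http://localhost:11434"


def ollama_reachable(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    /api/tags is Ollama's model-listing endpoint; any running server
    responds to it with JSON, so it makes a cheap reachability probe.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            json.load(resp)
            return True
    except (urllib.error.URLError, ValueError):
        # URLError wraps ConnectionRefusedError ([Errno 61] on macOS,
        # [Errno 111] on Linux) among other transport-level failures.
        return False


if __name__ == "__main__":
    print(ollama_reachable(BASE_URL))
```

If this probe fails against the exact same URL you pass to `OllamaLLM`, the problem is network-level (wrong host or port, server not listening) rather than anything specific to `langchain_ollama`; if it succeeds while `invoke()` still fails, the URL is probably not reaching the new client.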
Additional Context

The same setup previously worked with `langchain_community.llms.Ollama`.

Environment
Possible Solution
It seems that the `OllamaLLM` class might expect a different API structure or endpoint than the previous implementation. Have there been any changes in how `OllamaLLM` constructs API requests compared to the previous `Ollama` class?

Question
Is there any additional configuration or setup required when migrating from `langchain_community.llms.Ollama` to `langchain_ollama.OllamaLLM` for custom Ollama servers?