[GH-ISSUE #3915] ChatOllama does not support the FunctionMessage message type #2427

Closed
opened 2026-04-12 12:44:15 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @solarslurpi on GitHub (Apr 25, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3915

What is the issue?

When using LangGraph, the `FunctionMessage` message type is used. This type is not supported by

```
from langchain_experimental.llms.ollama_functions import OllamaFunctions
model = OllamaFunctions(model="llama3")
```

For example, this method in the `ChatOllama` class raises on anything other than human, AI, or system messages:

```python
def _convert_messages_to_ollama_messages(
    self, messages: List[BaseMessage]
) -> List[Dict[str, Union[str, List[str]]]]:
    ollama_messages: List = []
    for message in messages:
        role = ""
        if isinstance(message, HumanMessage):
            role = "user"
        elif isinstance(message, AIMessage):
            role = "assistant"
        elif isinstance(message, SystemMessage):
            role = "system"
        else:
            raise ValueError("Received unsupported message type for Ollama.")
```

The problem is that this missing support blocks the use of LangGraph.
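A possible workaround is to extend the role dispatch above to also map `FunctionMessage`. This is only a sketch: the message classes below are minimal stand-ins for the `langchain_core.messages` types so the example is self-contained, and forwarding function results under the `"user"` role is an assumption, since the Ollama chat API at 0.1.31 has no dedicated function/tool role.

```python
from dataclasses import dataclass
from typing import Dict, List


# Minimal stand-ins for the langchain_core.messages classes; in real code
# you would import HumanMessage, AIMessage, SystemMessage, FunctionMessage
# from langchain_core.messages instead.
@dataclass
class BaseMessage:
    content: str


class HumanMessage(BaseMessage):
    pass


class AIMessage(BaseMessage):
    pass


class SystemMessage(BaseMessage):
    pass


class FunctionMessage(BaseMessage):
    """Carries the result of a tool/function call back to the model."""


def convert_messages(messages: List[BaseMessage]) -> List[Dict[str, str]]:
    ollama_messages: List[Dict[str, str]] = []
    for message in messages:
        if isinstance(message, HumanMessage):
            role = "user"
        elif isinstance(message, AIMessage):
            role = "assistant"
        elif isinstance(message, SystemMessage):
            role = "system"
        elif isinstance(message, FunctionMessage):
            # Assumption: Ollama (0.1.31) exposes no function/tool role,
            # so function results are forwarded as user-turn content.
            role = "user"
        else:
            raise ValueError("Received unsupported message type for Ollama.")
        ollama_messages.append({"role": role, "content": message.content})
    return ollama_messages
```

With this extra branch, a LangGraph-style sequence ending in a `FunctionMessage` converts cleanly instead of raising `ValueError`.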

OS

Windows

GPU

Nvidia

CPU

AMD

Ollama version

0.1.31

GiteaMirror added the bug label 2026-04-12 12:44:15 -05:00
Author
Owner

@baswenneker commented on GitHub (Apr 26, 2024):

I have the same issue, but shouldn't this be filed in the LangChain repo?

<!-- gh-comment-id:2078667716 -->
Author
Owner

@solarslurpi commented on GitHub (Apr 26, 2024):

Good question! Argh.

<!-- gh-comment-id:2079216165 -->
Reference: github-starred/ollama#2427