[GH-ISSUE #13710] ollama._types.ResponseError: error parsing tool call #71050

Closed
opened 2026-05-04 23:51:14 -05:00 by GiteaMirror · 0 comments

Originally created by @khteh on GitHub (Jan 14, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/13710

What is the issue?

During execution of a LangChain deepagents workflow, the Ollama client raises an exception while parsing a streamed tool call.

Relevant log output

================================== Ai Message ==================================
Tool Calls:
  task (8f84c121-98b9-4254-bb93-325c7fb7991b)
 Call ID: 8f84c121-98b9-4254-bb93-325c7fb7991b
  Args:
    description: Research task decomposition, standard method, and common extensions. Provide concise findings with citations.
    subagent_type: RAG Sub-Agent
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/usr/src/Python/rag-agent/src/rag_agent/RAGAgent.py", line 251, in <module>
    asyncio.run(main())
    ~~~~~~~~~~~^^^^^^^^
  File "/usr/lib/python3.13/asyncio/runners.py", line 195, in run
    return runner.run(main)
           ~~~~~~~~~~^^^^^^
  File "/usr/lib/python3.13/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
  File "/usr/lib/python3.13/asyncio/base_events.py", line 725, in run_until_complete
    return future.result()
           ~~~~~~~~~~~~~^^
  File "/usr/src/Python/rag-agent/src/rag_agent/RAGAgent.py", line 248, in main
    await rag.ChatAgent(config, input_message)
  File "/usr/src/Python/rag-agent/src/rag_agent/RAGAgent.py", line 190, in ChatAgent
    async for step in self._agent.astream(
    ...<5 lines>...
        step["messages"][-1].pretty_print()
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/pregel/main.py", line 2974, in astream
    async for _ in runner.atick(
    ...<13 lines>...
            yield o
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/pregel/_runner.py", line 304, in atick
    await arun_with_retry(
    ...<15 lines>...
    )
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/pregel/_retry.py", line 138, in arun_with_retry
    return await task.proc.ainvoke(task.input, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/_internal/_runnable.py", line 705, in ainvoke
    input = await asyncio.create_task(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^
        step.ainvoke(input, config, **kwargs), context=context
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/_internal/_runnable.py", line 473, in ainvoke
    ret = await self.afunc(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 845, in _afunc
    outputs = await asyncio.gather(*coros)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 1199, in _arun_one
    content = _handle_tool_error(e, flag=self._handle_tool_errors)
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 429, in _handle_tool_error
    content = flag(e)  # type: ignore [assignment, call-arg]
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 386, in _default_handle_tool_errors
    raise e
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 1190, in _arun_one
    return await self._awrap_tool_call(tool_request, execute)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/deepagents/middleware/filesystem.py", line 1143, in awrap_tool_call
    tool_result = await handler(request)
                  ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 1181, in execute
    return await self._execute_tool_async(req, input_type, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 1125, in _execute_tool_async
    content = _handle_tool_error(e, flag=self._handle_tool_errors)
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 429, in _handle_tool_error
    content = flag(e)  # type: ignore [assignment, call-arg]
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 386, in _default_handle_tool_errors
    raise e
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/prebuilt/tool_node.py", line 1082, in _execute_tool_async
    response = await tool.ainvoke(call_args, config)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/tools/structured.py", line 67, in ainvoke
    return await super().ainvoke(input, config, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/tools/base.py", line 642, in ainvoke
    return await self.arun(tool_input, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/tools/base.py", line 1117, in arun
    raise error_to_raise
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/tools/base.py", line 1083, in arun
    response = await coro_with_context(coro, context)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/tools/structured.py", line 121, in _arun
    return await self.coroutine(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/deepagents/middleware/subagents.py", line 370, in atask
    result = await subagent.ainvoke(subagent_state, runtime.config)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/pregel/main.py", line 3161, in ainvoke
    async for chunk in self.astream(
    ...<29 lines>...
            chunks.append(chunk)
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/pregel/main.py", line 2974, in astream
    async for _ in runner.atick(
    ...<13 lines>...
            yield o
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/pregel/_runner.py", line 304, in atick
    await arun_with_retry(
    ...<15 lines>...
    )
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/pregel/_retry.py", line 138, in arun_with_retry
    return await task.proc.ainvoke(task.input, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/_internal/_runnable.py", line 705, in ainvoke
    input = await asyncio.create_task(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^
        step.ainvoke(input, config, **kwargs), context=context
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/_internal/_runnable.py", line 473, in ainvoke
    ret = await self.afunc(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 1185, in amodel_node
    response = await _execute_model_async(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 1156, in _execute_model_async
    output = await model_.ainvoke(messages)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/runnables/base.py", line 5570, in ainvoke
    return await self.bound.ainvoke(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<3 lines>...
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 425, in ainvoke
    llm_result = await self.agenerate_prompt(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<8 lines>...
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1132, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
        prompt_messages, stop=stop, callbacks=callbacks, **kwargs
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1090, in agenerate
    raise exceptions[0]
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1359, in _agenerate_with_cache
    result = await self._agenerate(
             ^^^^^^^^^^^^^^^^^^^^^^
        messages, stop=stop, run_manager=run_manager, **kwargs
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_ollama/chat_models.py", line 1208, in _agenerate
    final_chunk = await self._achat_stream_with_aggregation(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        messages, stop, run_manager, verbose=self.verbose, **kwargs
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_ollama/chat_models.py", line 991, in _achat_stream_with_aggregation
    async for chunk in self._aiterate_over_stream(messages, stop, **kwargs):
    ...<9 lines>...
            )
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_ollama/chat_models.py", line 1131, in _aiterate_over_stream
    async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
    ...<52 lines>...
            yield chunk
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_ollama/chat_models.py", line 937, in _acreate_chat_stream
    async for part in await self._async_client.chat(**chat_params):
        yield part
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/ollama/_client.py", line 762, in inner
    raise ResponseError(err)
ollama._types.ResponseError: error parsing tool call: raw='{"reflection":"I retrieved a source that mentions LLM+P and PDDL. The source ID is not clear but appears to be from Lilian Weng's blog. The snippet includes the LLM+P description. I have chain of thought, tree of thoughts, LLM+P. I need to provide concise findings with citations. I have enough info. I should cite [26] for chain of thought and tree of thoughts, and the Lilian Weng source for LLM+P. I should also maybe cite a source for standard method of task decomposition via simple prompting. That might be in the same Lilian Weng article. I think I have enough. I'll proceed to answer.', err=unexpected end of JSON input (status code: -1)
During task with name 'model' and id '0f9f613a-9230-9444-40e8-72790b5ac801'
During task with name 'tools' and id '7ddb24a4-b5d9-2f88-c9eb-67a9522be97e'
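The root cause is visible in the final error line: the model's streamed tool-call arguments end before the JSON object is closed, so the server-side parser fails with `unexpected end of JSON input`. A minimal sketch of the same failure in Python (the `raw` payload below is a shortened stand-in for the one in the log; Python's `json` module reports the analogous parse error for the incomplete object):

```python
import json

# Shortened stand-in for the raw tool-call payload from the log: the string
# value is complete, but the enclosing object is never closed with "}".
raw = '{"reflection":"I retrieved a source that mentions LLM+P and PDDL."'

try:
    json.loads(raw)
except json.JSONDecodeError as e:
    # The incomplete object surfaces as a decode error, just as Go's
    # encoding/json reports "unexpected end of JSON input" server-side.
    print(e.msg)
```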

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.13.5
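Until the parsing issue is fixed upstream, one possible mitigation is to retry the failing agent step, since re-sampling the model usually produces well-formed tool-call JSON. The sketch below is self-contained and illustrative only: the `ResponseError` stub and `run_with_retry` helper are assumptions, not code from this report; in the real application you would catch `ollama.ResponseError` around the `astream` loop instead.

```python
import asyncio

class ResponseError(Exception):
    """Stand-in for ollama._types.ResponseError so this sketch is self-contained."""

async def run_with_retry(run_agent, attempts=3):
    # Retry the whole agent step when the server fails to parse a tool call;
    # re-sampling the model often yields well-formed JSON on the next attempt.
    for attempt in range(attempts):
        try:
            return await run_agent()
        except ResponseError as e:
            if "error parsing tool call" not in str(e) or attempt == attempts - 1:
                raise
            await asyncio.sleep(0)  # back off before re-sampling

# Demo: fail once with the reported error, then succeed.
calls = {"n": 0}
async def flaky():
    calls["n"] += 1
    if calls["n"] == 1:
        raise ResponseError("error parsing tool call: unexpected end of JSON input")
    return "ok"

print(asyncio.run(run_with_retry(flaky)))
```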

GiteaMirror added the bug label 2026-05-04 23:51:14 -05:00

Reference: github-starred/ollama#71050