[GH-ISSUE #791] Sending and receiving Context with ollama.call() #377

Closed
opened 2026-04-12 10:01:23 -05:00 by GiteaMirror · 1 comment

Originally created by @gemini2463 on GitHub (Oct 15, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/791

Is sending and receiving `context` supported yet with `ollama.call()`?

```javascript
import { Ollama } from "langchain/llms/ollama";

// model, temperature, and topp come from the surrounding application state.
const ollama = new Ollama({
    baseUrl: "http://localhost:11434",
    model: model,
    temperature: parseFloat(temperature),
    topP: parseFloat(topp),
});
const response = await ollama.call(input);
```
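One way to work with `context` regardless of what the LangChain wrapper exposes is to call Ollama's REST API directly. This is a minimal sketch, assuming the 2023-era `/api/generate` endpoint, which accepts an optional `context` array from a previous response and returns a new `context` in its reply; `buildGenerateRequest` and `generate` are hypothetical helper names, and `llama2` is a placeholder model.

```javascript
// Build the request body for Ollama's /api/generate endpoint.
// `context` is the token array returned by a prior response, or
// undefined to start a fresh conversation.
function buildGenerateRequest(model, prompt, context) {
  const body = { model, prompt, stream: false };
  if (context) body.context = context;
  return body;
}

// Hypothetical usage against a local Ollama server (Node 18+ for fetch):
async function generate(prompt, context) {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest("llama2", prompt, context)),
  });
  const data = await res.json();
  // data.response holds the generated text; data.context holds the
  // token array to pass back on the next call to continue the thread.
  return data;
}
```

The returned `data.context` would then be fed into the next `generate` call to carry the conversation forward.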

@mxyng commented on GitHub (Oct 16, 2023):

The Ollama langchain integration is maintained by the langchain team. Perhaps you can create an issue in https://github.com/langchain-ai/langchain?


Reference: github-starred/ollama#377