[GH-ISSUE #215] Function calling #88

Closed
opened 2026-04-12 09:37:51 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @nathanleclaire on GitHub (Jul 25, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/215

Trying to get structured/consistent responses out of LLMs can be pretty brutal.

OpenAI recently rolled out [Function Calling](https://openai.com/blog/function-calling-and-other-api-updates) to get the models to stick to pre-defined schemas.

It would be excellent if you could specify something like this (ins/outs) in the modelfile:

```
FROM llama

INPUT sentence string

ENUM Sentiment ["good", "bad", "neutral"]

OUTPUT classification Sentiment

PROMPT """
You are skilled at detecting tone in user comments.
Classify the following comment:

${sentence}
"""
```

then something like:

```
$ ollama run sentiment "ClosedAI has no moat"
bad
```

(or better yet, with API :) )

GiteaMirror added the feature request label 2026-04-12 09:37:51 -05:00

@jmorganca commented on GitHub (Jul 27, 2023):

@nathanleclaire nice! Have you seen this? https://github.com/ggerganov/llama.cpp/pull/1773
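That llama.cpp PR adds grammar-constrained sampling. For the sentiment example in this issue, a grammar could pin the model's output to exactly the three labels — a minimal sketch, assuming the GBNF-style syntax introduced in that PR:

```
# Constrain generation to one of the three sentiment labels.
root ::= "good" | "bad" | "neutral"
```

With such a grammar loaded, the sampler can only emit tokens that keep the output matching the grammar, so the `ENUM` constraint from the proposed modelfile syntax would be enforced at generation time rather than by prompting alone.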


@nathanleclaire commented on GitHub (Jul 27, 2023):

> @nathanleclaire nice! Have you seen this? [ggerganov/llama.cpp#1773](https://github.com/ggerganov/llama.cpp/pull/1773)

I had not! That's dope!!


@technovangelist commented on GitHub (Dec 4, 2023):

A few weeks back we added `format: json`, which solves most of the points here. We can specify that the output should be JSON, and we can specify the schema and types to be used. It's not at the modelfile level, but it can be applied to any model, either through the API, at the CLI with `ollama run --format json`, or in the repl with `set format json`.

I think this solves your original request, so I will close the issue. If you think it's not solved, let us know what else is needed by reopening the issue. Thanks
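The `format: json` option described above is also available on the HTTP API. The sketch below builds a request for `POST /api/generate` that asks for the sentiment classification from the original example as JSON; the model name, prompt wording, and response schema are illustrative assumptions, not part of the feature itself:

```python
import json
import urllib.request

# Request body for POST /api/generate. "format": "json" constrains the
# model's reply to valid JSON; the prompt spells out the schema we want.
# (Model name and wording here are illustrative.)
payload = {
    "model": "llama2",
    "prompt": (
        "You are skilled at detecting tone in user comments. "
        'Classify the following comment as "good", "bad", or "neutral". '
        'Respond as JSON: {"classification": ...}\n\n'
        "ClosedAI has no moat"
    ),
    "format": "json",
    "stream": False,
}


def classify(url: str = "http://localhost:11434/api/generate") -> str:
    """Send the request to a local Ollama server and extract the label."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The model's reply is itself a JSON string in the "response" field.
    return json.loads(body["response"])["classification"]
```

Note that `format: json` guarantees syntactically valid JSON, but keeping the model to the three-label schema still relies on the prompt, so validating the parsed value is a good idea.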


@nathanleclaire commented on GitHub (Dec 7, 2023):

dooooope!


@technovangelist commented on GitHub (Dec 9, 2023):

YES!!! That’s the response I hoped for.


Reference: github-starred/ollama#88