[GH-ISSUE #1027] How to properly format Advanced Parameters / options in API calls? #501

Closed
opened 2026-04-12 10:11:59 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @tob-har on GitHub (Nov 7, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1027

Originally assigned to: @BruceMacD on GitHub.

The API documentation gives a proper example of how to use
POST /api/generate

But how do I properly format the JSON object to use advanced parameters,
especially options and system?

I tried the following request via POST /api/generate.
Everything behaves as expected, e.g. stream, but options is not working:

{ "model": "llama2:latest", "stream": false, "prompt": "Sing a song.", "options": { "temperature": 5} }

Happy about any hints! Thanks a lot.


@BruceMacD commented on GitHub (Nov 7, 2023):

Hi @tob-har, your request specifying system and temperature will look something like this:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "llama2:latest",
  "prompt": "Sing a song",
  "system": "You are Beyoncé",
  "stream": false,
  "options": {
    "temperature": 0.8
  }
}'
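For readers calling the endpoint from code rather than curl, here is a minimal Python sketch (standard library only) that builds the same request body. The helper name `build_generate_request` is illustrative, not part of any Ollama client library; the key point it demonstrates is that sampling parameters such as temperature go inside the nested "options" object, while model, prompt, system, and stream sit at the top level.

```python
import json


def build_generate_request(model, prompt, system=None, options=None, stream=False):
    """Build the JSON body for POST /api/generate.

    Top-level fields: model, prompt, system, stream.
    Sampling parameters (temperature, top_k, top_p, ...) belong
    inside the nested "options" object, not at the top level.
    """
    body = {"model": model, "prompt": prompt, "stream": stream}
    if system is not None:
        body["system"] = system
    if options is not None:
        body["options"] = options
    return json.dumps(body)


payload = build_generate_request(
    "llama2:latest",
    "Sing a song",
    system="You are Beyoncé",
    options={"temperature": 0.8},
)
print(payload)

# To actually send it (assumes a local Ollama server on the default port):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=payload.encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```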

@tob-har commented on GitHub (Nov 7, 2023):

Thanks @BruceMacD! Yes, that's as far as I got.
But setting advanced parameters like this, as an object value of a key/value pair, has no effect:

"options": {
  "temperature": 0.8
}

even when using extreme values for top_k or top_p.

I also don't get any errors, so I assume the object is somehow ignored when the request is processed.
Do the parameters perhaps need to be set in the Modelfile when creating the model, so that they can be overridden later?
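(Editor's note on the Modelfile question: per the Ollama documentation, PARAMETER lines in a Modelfile only set defaults, and values sent in a request's "options" object override those defaults, so no Modelfile changes should be needed. A minimal Modelfile sketch, with illustrative parameter values:)

```
FROM llama2:latest
# Defaults baked into the model; request-time "options" override them.
PARAMETER temperature 0.7
PARAMETER top_k 40
PARAMETER top_p 0.9
```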


@BruceMacD commented on GitHub (Nov 8, 2023):

I just tested the options and they are properly passed to the LLM in the most recent version. What behavior do you expect to see?

Reference: github-starred/ollama#501