[PR #615] [CLOSED] add ollama run flags: template, context, stop #36133

Closed
opened 2026-04-22 20:50:41 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/615
Author: @sqs
Created: 9/27/2023
Status: Closed

Base: main ← Head: run-opts


📝 Commits (1)

  • 4b15b5c add ollama run flags: template, context, stop

📊 Changes

1 file changed (+19 additions, -1 deletion)


📝 cmd/cmd.go (+19 -1)

📄 Description

These new `ollama run` flags make `ollama run` useful for debugging more advanced invocations of the Ollama generate API.

For example, the following command generates completions with context tokens for `const primes=[1,2,3,5,7`, a stop sequence (`;`), and a custom template:

```
ollama run --verbose --context 3075,544,1355,353,518,29896,29892,29906,29892,29941,29892,29945,29892,29955,29892 --template '{{.Prompt}}' --stop ';' codellama:7b-code ''
```

You can accomplish something similar with `curl` and the Ollama API, but the `ollama run` CLI is easier to use and also prints the nice verbose timing output in an easy-to-consume form.
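As a rough sketch of the `curl` route mentioned above, the same invocation maps onto a JSON body for the Ollama `/api/generate` endpoint (field names per that API: `context`, `template`, and the `stop` option). This only builds the payload; POSTing it to a running server is left out:

```python
import json

# Equivalent request body for POST http://localhost:11434/api/generate.
payload = {
    "model": "codellama:7b-code",
    "prompt": "",
    # Context tokens for `const primes=[1,2,3,5,7` (same list as the CLI flag).
    "context": [3075, 544, 1355, 353, 518, 29896, 29892, 29906, 29892,
                29941, 29892, 29945, 29892, 29955, 29892],
    "template": "{{.Prompt}}",
    "options": {"stop": [";"]},
}

body = json.dumps(payload)
```

Send `body` with `curl -d @-` or any HTTP client; the point of the PR is that the flags above spare you from assembling this by hand.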


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-22 20:50:41 -05:00

Reference: github-starred/ollama#36133