[GH-ISSUE #280] Non-interactive mode for batching inputs #122

Closed
opened 2026-04-12 09:39:21 -05:00 by GiteaMirror · 2 comments

Originally created by @jmthackett on GitHub (Aug 4, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/280

Just something along these lines:

ollama run <my model> -f input.txt -n <number of runs> -o output.txt

Not essential by any stretch of the imagination, but it'd be handy. My use case is batch-processing prompts by iterating over a list of text files. At the moment I'm looking at how to wrap it all up in bash - probably by piping to stdin - but it isn't easy to tell when a run has finished.

GiteaMirror added the feature request label 2026-04-12 09:39:21 -05:00

@mxyng commented on GitHub (Aug 7, 2023):

@jmthackett can you elaborate on the behaviour when ingesting the input file? Is each line a separate prompt or is the entire file expected to be a single prompt?

Based on the issue, a simple script could look like the following. Once all runs and all files are processed, the script exits.

Note this will ingest each line in each file as a separate prompt. If you want to ingest the entire file as one prompt, replace `<"$FILE"` with `"$(cat "$FILE")"`.

for FILE in <files>; do
  for N in $(seq <number of runs>); do
    ollama run <my model> <"$FILE" >"$FILE-$N-output.txt"
  done
done
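For the whole-file-as-one-prompt case mentioned above, the loop can be packaged into a small POSIX-sh helper. This is only an illustrative sketch, not an ollama feature: the `batch_prompts` function name and its arguments are my own invention, and the model name you pass in is a placeholder.

```shell
#!/bin/sh
# Hypothetical helper (not part of ollama): run every *.txt file in a
# directory through `ollama run` N times, sending each WHOLE file as a
# single prompt and writing every completion to its own output file.
batch_prompts() {
  MODEL=$1; RUNS=$2; DIR=$3
  for FILE in "$DIR"/*.txt; do
    for N in $(seq "$RUNS"); do
      # "$(cat "$FILE")" passes the whole file as one prompt;
      # use <"$FILE" instead to feed it line by line on stdin.
      ollama run "$MODEL" "$(cat "$FILE")" > "$FILE-$N-output.txt"
    done
  done
}
```

Because the function only returns after every file and run has been processed, a caller can simply invoke `batch_prompts mymodel 3 ./prompts` and know all outputs are on disk when it returns.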

@technovangelist commented on GitHub (Dec 4, 2023):

It looks like your issue was solved by Mike's script above. I'll go ahead and close this issue but if you think it isn't solved, feel free to reopen. Thanks for being part of this great community.

Reference: github-starred/ollama#122