[GH-ISSUE #1710] How do we output ollama response to file? #47478

Closed
opened 2026-04-28 03:54:43 -05:00 by GiteaMirror · 6 comments

Originally created by @oliverbob on GitHub (Dec 25, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1710

If Ollama can read prompts from a file, there should be a way to write the response to a file and save it in the working directory.

How do I achieve this?

Scenario:

```
ollama run dolphin-phi
>>> '/home/ai/repo/llama2.c/run.c' rewrite this code with arguments for blah... 😄
```

Thanks.

@stmonty commented on GitHub (Dec 26, 2023):

If you're on a Unix-like system (Linux, MacOS) you should be able to just do something like:

```
ollama run dolphin-phi "prompt" >> response.md
```

I have tried this using some other models and it worked.

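For the original scenario of feeding a source file into the prompt, a minimal sketch using command substitution (assuming the file fits in a single shell argument; `>` truncates the output file, while `>>` appends):

```
ollama run dolphin-phi "rewrite this code with arguments for blah: $(cat /home/ai/repo/llama2.c/run.c)" > response.md
```
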
@oliverbob commented on GitHub (Dec 26, 2023):

Thanks

> If you're on a Unix-like system (Linux, MacOS) you should be able to just do something like:
>
> ```
> ollama run dolphin-phi "prompt" >> response.md
> ```
>
> I have tried this using some other models and it worked.

This is a big help.

@pongnguy commented on GitHub (May 31, 2024):

I get a bunch of control characters. How can I get a clean output?

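The control characters are most likely ANSI escape sequences from the interactive spinner. A hedged sketch that discards stderr and strips CSI escape codes with sed (GNU sed shown; the `\x1b` escape may need replacing with a literal ESC character on BSD/macOS sed):

```
ollama run dolphin-phi "prompt" 2>/dev/null | sed 's/\x1b\[[0-9;?]*[a-zA-Z]//g' >> response.md
```
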
@shakib04 commented on GitHub (Aug 2, 2024):

I developed a simple bash script, with the help of ChatGPT, for prompting.

```
#!/bin/bash

# Define the output file
output_file="Response.md"

# # Ensure the output file is empty or create it if it doesn't exist
# > "$output_file"

# Loop until the user decides to exit
while true; do
    # Prompt the user for input
    read -p "Enter something (or type 'exit' to quit): " user_input

    # Check if the user wants to exit
    if [[ "$user_input" == "exit" ]]; then
        echo "Exiting..."
        break
    fi

    # Print a header marking the start of this prompt's response
    echo -e "<h2>======  Start of $user_input  ======</h2>" >> "$output_file"

    # Run the command and append the output directly to the file
    ollama run phi3:mini "$user_input" >> "$output_file"

    echo -e "<h2>======  End of $user_input  ======</h2>" >> "$output_file"

    # Inform the user that the response has been saved
    echo "The response has been saved to $output_file"

done
```
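To try the script (assuming it is saved as `ask.sh`; the filename is arbitrary):

```
chmod +x ask.sh
./ask.sh
```
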
@GAC-Machine commented on GitHub (Jan 14, 2025):

> If you're on a Unix-like system (Linux, MacOS) you should be able to just do something like:
>
> ```
> ollama run dolphin-phi "prompt" >> response.md
> ```
>
> I have tried this using some other models and it worked.

What about Windows?

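The same redirection syntax works on Windows in both cmd.exe and PowerShell:

```
ollama run dolphin-phi "prompt" >> response.md
```

One caveat: Windows PowerShell 5.1 writes redirected output as UTF-16 by default (PowerShell 7 uses UTF-8). To force UTF-8 there, pipe through `Out-File` instead:

```
ollama run dolphin-phi "prompt" | Out-File -Append -Encoding utf8 response.md
```
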
@johntinker commented on GitHub (Feb 3, 2025):

(I edited shakib04's script above:)

```
#!/bin/bash

output_file="response.txt"

# Loop until the user decides to exit
while true; do
    # Prompt the user for input
    read -p "Enter something (or type 'exit' to quit): " user_input

    # Check if the user wants to exit
    if [[ "$user_input" == "exit" ]]; then
        echo "Exiting..."
        break
    fi

    # Print a header marking the start of this prompt's response
    echo -e "<h2>======  Start of $user_input  ======</h2>" >> "${output_file}"

    # Run the command and append the output directly to the file
    ollama run deepseek-r1:14b "$user_input" >> "${output_file}"

    echo -e "<h2>======  End of $user_input  ======</h2>" >> "${output_file}"

    # Inform the user that the response has been saved
    echo "The response has been saved to ${output_file}"
done
```