[GH-ISSUE #357] Support multi-line input in CLI #25919

Closed
opened 2026-04-22 01:46:14 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @charlesverdad on GitHub (Aug 16, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/357

I'm trying to copy-paste a multi-line query to ollama, but it treats each newline as the end of my question.

````
❯ ollama run llama2
>>> I have something like this:
 Sure, please provide the code you have so far, and I will be happy to assist you in resolving any issues or answering any questions you may have. everybody has made mistakes in their coding at some point, and it's nothing to be ashamed of.

>>>
>>> ```
 Thank you for sharing your code with me! However, I notice that there are a few syntax errors in the code you provided. Here are the issues I found:

1. `if` statement without an condition: You have an `if` statement without any condition. An `if` statement should always have a condition to check whether the statement inside the `if` block should be executed or not. For example, you could replace the `if` statement with `if (x > 0)` to make^C
````

It would be great to make the user experience a bit better by allowing multi-line queries straight from the CLI. I'm not sure how to implement this in a terminal, but I remember that IPython is able to do this.

GiteaMirror added the documentation label 2026-04-22 01:46:14 -05:00

@pascalandy commented on GitHub (Aug 16, 2023):

Yes, same here. When I paste long-form content, it accepts 2-3 paragraphs and ignores most of my content.


@mchiang0610 commented on GitHub (Aug 17, 2023):

Hey @charlesverdad @pascalandy, great question! Sorry that we haven't made multi-line support clear.

You can do this by running a model, then opening the multi-line input with triple quotes (`"""`) and closing it with triple quotes.

example:

```
ollama run llama2
>>> """
... This is the first line
... second line
... """
```
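Outside the interactive REPL, a multi-line prompt can also be passed as a single quoted shell argument, since the shell preserves embedded newlines inside quotes (with ollama this would look like `ollama run llama2 "$PROMPT"`). A minimal sketch, using `printf` as a stand-in so the effect is visible without a model installed:

```shell
# A multi-line string stored in a shell variable keeps its newlines
# as long as the variable is expanded inside double quotes.
PROMPT='This is the first line
second line'

# printf stands in for `ollama run llama2 "$PROMPT"` here; it shows
# that the argument reaches the command with both lines intact.
printf '%s\n' "$PROMPT"
```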

@BruceMacD commented on GitHub (Aug 17, 2023):

This is in the readme now, so resolving this issue.


@pascalandy commented on GitHub (Aug 17, 2023):

It works :) Put your prompt within triple quotes: `"""prompt + context + content to evaluate"""`


@rtrad89 commented on GitHub (Nov 7, 2024):

FYI, heredocs seem to work fine with multi-line prompts too:

```bash
% ollama run llama3.1:8b <<-EOD
heredocd> What is
heredocd> the highest
heredocd> mountain in the world?
heredocd> EOD
The highest mountain in the world is Mount Everest, located in the Himalayas on the border between Nepal and Tibet. Its peak elevation is 8,848 meters (29,029 feet) above sea level.
```
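The heredoc works because the shell collects every line up to the closing `EOD` delimiter and feeds the whole block, newlines intact, to the command's standard input. A quick way to see this behavior with a universally available command (using `cat` as a stand-in for `ollama run <model>`) is:

```shell
# The shell reads all lines up to the EOD delimiter and pipes them
# to the command's stdin as one multi-line string. `cat` stands in
# for `ollama run <model>` so the effect is visible without a model.
cat <<EOD
What is
the highest
mountain in the world?
EOD
```

Note that the `<<-EOD` variant above additionally strips leading tab characters from each line, which is handy when the heredoc is indented inside a script.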

Reference: github-starred/ollama#25919