[GH-ISSUE #656] CLI run output not standard output #294

Closed
opened 2026-04-12 09:50:14 -05:00 by GiteaMirror · 4 comments

Originally created by @reustle on GitHub (Sep 30, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/656

Hello, I've been on this for quite some time now, and I'm sorry if I'm misinformed.

To me, it seems like even when I use the command-line argument style of input, such as `ollama run mistral "Here is my prompt"` (as mentioned here https://github.com/jmorganca/ollama#pass-in-prompt-as-arguments ), the output isn't clean text.

When I run that command manually, the output should be plain text with newline characters, but instead it contains extra control characters that re-wrap the text to the width of the terminal that invoked the command. Here's an example from `ollama run mistral "Here is my prompt" > out.txt`; it adds some strange characters to the output.

![Screenshot of out.txt showing escape characters in the output](https://github.com/jmorganca/ollama/assets/304560/6e8152e9-b5c8-448c-9d7c-9d2fc8661924)

I think this has to do with how Ollama handles terminal output, similar to what happens in interactive chat mode. I would expect the little ASCII loading spinner not to show when Ollama is used as a standard command-line tool (passing the prompt directly rather than using chat mode).

If my understanding is correct here, and you agree that we shouldn't be using the fancy terminal features and should instead just write the finished text to stdout, I'm happy to take a swing at creating a PR to fix this case.

Thank you!
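For context, the behavior being asked for here can be sketched with a plain POSIX shell check: a program can test whether its stdout is attached to a terminal and only enable the spinner and word wrapping when it is. This is a general illustration of the idea, not Ollama's actual code:

```shell
#!/bin/sh
# General sketch (not Ollama's implementation): enable fancy terminal
# features only when stdout is really a terminal. `[ -t 1 ]` is the
# POSIX test for "file descriptor 1 is a terminal".
if [ -t 1 ]; then
    echo "stdout is a terminal: spinner and word wrap are fine"
else
    echo "stdout is redirected: emit plain text only"
fi
```

Running the script directly in a terminal prints the first message; running it as `./check.sh > out.txt` prints the second, which is exactly the distinction this issue asks Ollama to make.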

GiteaMirror added the bug label 2026-04-12 09:50:14 -05:00

@jmorganca commented on GitHub (Sep 30, 2023):

Hi @reustle, thanks for reporting this! Indeed, word wrapping (which is what those escape characters are used for) should only happen in interactive mode, not when running `ollama run mistral "example prompt"` or piping data in/out via stdin/stdout.

We'll make sure this gets fixed and in the meantime do feel free to open a PR, that would be awesome!

cc @pdevine


@pdevine commented on GitHub (Sep 30, 2023):

This is addressed in #662; however, as a workaround you can use `ollama run mistral --nowordwrap "Here is my prompt" > out.txt`.
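For builds that predate the fix, a generic post-processing workaround (my own suggestion, not something mentioned in this thread) is to strip the ANSI CSI escape sequences out of the captured text with `sed`:

```shell
#!/bin/sh
# Strip CSI escape sequences (colors, cursor movement, etc.) from
# piped text, using only POSIX printf and sed.
esc=$(printf '\033')                       # a literal ESC byte
printf 'wrapped\033[1G text\033[0m\n' |
    sed "s/${esc}\\[[0-9;]*[a-zA-Z]//g"    # delete ESC [ params letter
```

The sample input above contains a cursor-movement sequence and a reset sequence; after filtering, only the plain text `wrapped text` remains, so the same filter applied to `out.txt` would clean it up.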


@reustle commented on GitHub (Oct 1, 2023):

Fantastic, thanks so much


@pdevine commented on GitHub (Oct 2, 2023):

Should be fixed now.

Reference: github-starred/ollama#294