[GH-ISSUE #1284] Argument list too long #660

Closed
opened 2026-04-12 10:21:01 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @shubhammicrosoft1 on GitHub (Nov 27, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1284

When I run a summarization with Ollama on Linux, reading a 7 MB file and summarizing its contents, it reports:

```
bash: /usr/local/bin/ollama: Argument list too long
```

Command used:

```
ollama run llama2 "$(cat data.txt)" please summarize this data
```

Is this an OS limitation, or is there some configuration we can update in Ollama?


@BruceMacD commented on GitHub (Nov 27, 2023):

This command should work, I did a quick test on Ubuntu to validate. I would guess that there is a character in the file you're opening that is messing with the command input. Maybe try checking for that.


@easp commented on GitHub (Nov 27, 2023):

Trying to feed it a 7 MB file sounds problematic.

First, and probably the issue here: `getconf ARG_MAX` will show the maximum command-line size on Linux and other POSIX-compliant systems. On my Linux box it's 2 MiB. Feeding the content via standard input (`ollama run llama2 "please summarize this data" < data.txt`) should get around that, but then you'll run into an even bigger problem.

Second, llama2's context is 4096 tokens. 7 MB is going to be a few orders of magnitude more than that. I think the biggest available context in an open-source LLM is 200k tokens, which still isn't likely to be enough for 7 MB of text.
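The limit described above can be checked directly. A minimal sketch, assuming a Linux shell (`data.txt` is a hypothetical stand-in for the file being summarized):

```shell
# ARG_MAX bounds the total size of argv plus the environment passed to exec(),
# so "$(cat data.txt)" fails in the shell before ollama even starts.
limit=$(getconf ARG_MAX)
echo "ARG_MAX is $limit bytes"

# Redirecting stdin avoids the exec() argument limit entirely:
#   ollama run llama2 "please summarize this data" < data.txt
```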


@technovangelist commented on GitHub (Dec 4, 2023):

It looks like @easp's answer addresses your issue. The 7 MB file is going to be larger than the context size.

But I think the error message is a bit cryptic, and we should fix that to better point to the actual problem.
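As a rough illustration of the context-size point (a back-of-the-envelope sketch, not an Ollama tool, assuming ~4 bytes per token for English text):

```shell
# Estimate whether a file fits llama2's 4096-token context window.
# data.txt is created here as a stand-in for the file being summarized.
printf 'some example text that would be summarized' > data.txt

bytes=$(wc -c < data.txt)
tokens=$((bytes / 4))   # crude heuristic: ~4 bytes per token of English text
echo "~$tokens tokens (llama2 context window: 4096 tokens)"
```

By this estimate a 7 MB file works out to roughly 1.8 million tokens, several hundred times the window.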


@mxyng commented on GitHub (Jan 20, 2024):

The error is from bash, not ollama, so there's nothing to be done here.

A not-so-recent change also changed the CLI's behaviour to append the contents of stdin to the prompt. An equivalent is this, which won't have the same shell limitations:

```
ollama run llama2 'please summarize this data' <data.txt
```

Reference: github-starred/ollama#660