[GH-ISSUE #161] Asking Llama 2 to read a local text file #62099

Closed
opened 2026-05-03 07:31:02 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @ianbheadley on GitHub (Jul 21, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/161

Has anyone been able to get Llama 2 to read a txt file for analysis?

GiteaMirror added the feature request label 2026-05-03 07:31:02 -05:00

@BruceMacD commented on GitHub (Jul 21, 2023):

You can pass a text file into the prompt using command substitution; this simply adds the content of the file to the prompt. The input will be limited by the context size of our default models at the moment, which isn't very large.

Here is an example where I have some of the Wikipedia article on llamas in a text file:

```
$ ollama run llama2 "$(cat llama.txt)" please summarize this article
Sure, I'd be happy to summarize the article for you! Here is a brief summary of the main points:
* Llamas are domesticated South American camelids that have been used as meat and pack animals by Andean cultures since the Pre-Columbian era.
* Llamas are social animals that live in herds and their wool is soft and contains a small amount of lanolin. They can learn simple tasks after a few repetitions and can carry about 25-30% of their body weight for 8-13 km (5-8 miles) when using a pack.
```
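The expansion itself can be checked without involving the model at all. A minimal sketch (the `/tmp` file and its contents are illustrative):

```shell
# Command substitution: "$(cat file)" expands to the file's full
# contents, which is how the article text reaches the model's prompt.
printf 'Llamas are camelids.' > /tmp/llama.txt
PROMPT="$(cat /tmp/llama.txt) please summarize this article"
echo "$PROMPT"
```

The `ollama run llama2 "$(cat llama.txt)" ...` invocation above relies on exactly this expansion; keeping the substitution inside double quotes preserves the file's internal whitespace as a single argument.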

@mchiang0610 commented on GitHub (Aug 30, 2023):

Thank you for submitting this! This should be resolved.

There is also a demo of this with privateGPT: https://github.com/jmorganca/ollama/tree/main/examples/privategpt

Please let me know if it's not solving your use case.

Thank you!


@dhowe commented on GitHub (Apr 12, 2025):

> You can pass a text file into the prompt using command substitution, this just adds the content of the file to the prompt. This will be limited by context size in our default models at the moment, which isn't very large.
>
> Here is an example where I have some of the wikipedia article on llamas in a text file:
>
> ```
> $ ollama run llama2 "$(cat llama.txt)" please summarize this article
> Sure, I'd be happy to summarize the article for you! Here is a brief summary of the main points:
> * Llamas are domesticated South American camelids that have been used as meat and pack animals by Andean cultures since the Pre-Columbian era.
> * Llamas are social animals that live in herds and their wool is soft and contains a small amount of lanolin. They can learn simple tasks after a few repetitions and can carry about 25-30% of their body weight for 8-13 km (5-8 miles) when using a pack.
> ```

Is there a way to do this that returns you to the model's interactive prompt afterwards, for example to ask a follow-up question?
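One way to get follow-up questions with context is to skip the one-shot CLI prompt and instead keep the chat history yourself, resending it each turn to Ollama's `/api/chat` endpoint. A minimal sketch (the `/tmp` path and payload wiring are illustrative; the actual `curl` call is left commented out, and real files would need proper JSON escaping, e.g. via `jq`):

```shell
# Seed the conversation with the file contents, then ask follow-ups
# by POSTing the growing message history to /api/chat each turn.
printf 'Llamas are camelids.' > /tmp/llama.txt
DOC=$(cat /tmp/llama.txt)

# First turn: the file contents plus the instruction.
# NOTE: DOC must contain no unescaped quotes/newlines for this naive
# string splicing to stay valid JSON.
MSGS='[{"role":"user","content":"'"$DOC"' Please summarize this article."}]'
PAYLOAD='{"model":"llama2","messages":'"$MSGS"',"stream":false}'
echo "$PAYLOAD"
# curl -s http://localhost:11434/api/chat -d "$PAYLOAD"
# A follow-up turn appends the assistant's reply and the new question
# to MSGS and POSTs the same way, so the model keeps the context.
```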

Reference: github-starred/ollama#62099