[GH-ISSUE #4750] Garbage output running llama3 GGUF model #2993

Closed
opened 2026-04-12 13:23:20 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @DiptenduIDEAS on GitHub (May 31, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4750

What is the issue?

I downloaded https://huggingface.co/QuantFactory/Meta-Llama-3-8B-GGUF/blob/main/Meta-Llama-3-8B.Q2_K.gguf
Created a Modelfile, ran `ollama create example -f Modelfile`,
and then ran `ollama run example`

When I ask the question _why is the sky blue?_ at the `>>>` prompt, I get garbage output (a series of numbers).

![image](https://github.com/ollama/ollama/assets/156412399/af66c002-32cf-42a2-bfb8-5e1edf890248)

OS

Windows

GPU

Other

CPU

Intel

Ollama version

0.1.31

GiteaMirror added the bug label 2026-04-12 13:23:20 -05:00
Author
Owner

@jmorganca commented on GitHub (Jul 5, 2024):

Hi there, would it be possible to try again? You may be missing a prompt template, but that should be auto-detected now 😊. Sorry you hit this issue.

Author
Owner

@DiptenduIDEAS commented on GitHub (Jul 9, 2024):

Hello Jeffrey,

Thank you for the update.

Could you please share an example of how a prompt may be created according
to a template for a command-line invocation of the model (rather than from
a Python program)?

Regards,

Diptendu Dutta

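(For anyone landing here later: a minimal Modelfile sketch that supplies a Llama 3 instruct-style prompt template by hand, so the template applies to plain `ollama run` invocations. The GGUF filename matches the file linked above; the template text is an assumption based on the published Llama 3 chat format, so verify the exact tokens against the model card.)

```
# FROM points at the locally downloaded GGUF file
FROM ./Meta-Llama-3-8B.Q2_K.gguf

# Llama 3 instruct-style template (assumed; verify against the model card)
TEMPLATE """<|begin_of_text|><|start_header_id|>user<|end_header_id|>

{{ .Prompt }}<|eot_id|><|start_header_id|>assistant<|end_header_id|>

"""

# Stop generation at the end-of-turn token
PARAMETER stop <|eot_id|>
```

Then `ollama create example -f Modelfile` and `ollama run example` as before. Note that Meta-Llama-3-8B is a base model rather than an Instruct variant, so it may still produce rambling completions even with a chat template in place.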

Reference: github-starred/ollama#2993