[GH-ISSUE #4057] Error when trying to run llama3 #64557

Closed
opened 2026-05-03 18:07:27 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @omagdy7 on GitHub (Apr 30, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4057

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

Just pulled llama3 and when I try to run it I get this weird error:

Error: template: :7:3: executing "" at <.Response>: can't evaluate field Response in type struct { First bool; System string; Prompt string; Context []int }

Note:
mistral:7b runs just fine
I also tried removing the model and re-fetching it

Here is the template of llama3, from `ollama show llama3 --template`:
{{ if .System }}<|start_header_id|>system<|end_header_id|>

{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>

{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>

{{ .Response }}<|eot_id|>

OS

Linux

GPU

Nvidia

CPU

AMD

Ollama version

0.1.7

GiteaMirror added the gpu, bug, amd labels 2026-05-03 18:07:28 -05:00

@reski-rukmantiyo commented on GitHub (May 1, 2024):

Hi, is it possibly due to an old version of Ollama?

root@ollama-7d4795fd9f-tn9p8:/# ollama show llama3 --template
{{ if .System }}<|start_header_id|>system<|end_header_id|>

{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>

{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>

{{ .Response }}<|eot_id|>
root@ollama-7d4795fd9f-tn9p8:/# ollama --version
ollama version is 0.1.32

I'm using docker image version of Ollama


@omagdy7 commented on GitHub (May 1, 2024):

Yeah, apparently something is wrong with the upstream package in the Arch Linux official repositories. I used the official install script instead and it now works fine.


Reference: github-starred/ollama#64557