[GH-ISSUE #2039] web-ui log error loading model: llama.cpp: tensor 'layers.2.ffn_norm.weight' is missing from model #1177

Closed
opened 2026-04-12 10:57:54 -05:00 by GiteaMirror · 2 comments

Originally created by @lpf763827726 on GitHub (Jan 18, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2039

When I run `ollama run llama2:13b` and `ollama run codellama` with ollama-webui and ask 2–3 questions, it starts to fail, reporting an error that something is missing from the model.

Related issue: [ollama-webui#507](https://github.com/ollama-webui/ollama-webui/issues/507)

@lpf763827726 commented on GitHub (Jan 18, 2024):

For now, running `systemctl restart service` seems to fix it.


@pdevine commented on GitHub (May 17, 2024):

I'm going to assume the issue went stale since this was from quite some time ago. It's most likely that you ran out of memory; however, Ollama is quite a bit better at handling memory now. I'll close it, but feel free to keep commenting and we can reopen.
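The out-of-memory theory is plausible: switching between `llama2:13b` and `codellama` can leave both models resident at once. A rough back-of-envelope sketch (assuming ~4-bit quantization, i.e. roughly 0.5 bytes per parameter — an assumption, not something stated in the thread):

```shell
# Weights alone for a 13B-parameter model at ~4-bit quantization.
# KV cache and the second model's weights come on top of this, so two
# models loaded simultaneously can exhaust a typical 16 GB machine.
awk 'BEGIN { printf "approx 13B q4 weights: %.1f GB\n", 13e9 * 0.5 / 1e9 }'
```

This prints `approx 13B q4 weights: 6.5 GB`, which is consistent with a restart (unloading the models) making the error go away.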


Reference: github-starred/ollama#1177