[GH-ISSUE #5596] version is 0.2.1 can't run glm4 #50008

Closed
opened 2026-04-28 13:46:52 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @qiulaidongfeng on GitHub (Jul 10, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5596

Originally assigned to: @jmorganca on GitHub.

What is the issue?

Running `ollama run glm4` fails with:

Error: this model is not supported by your version of Ollama. You may need to upgrade

OS

Windows

GPU

AMD

CPU

AMD

Ollama version

ollama version is 0.2.1

GiteaMirror added the bug label 2026-04-28 13:46:52 -05:00

@qiulaidongfeng commented on GitHub (Jul 10, 2024):

After reinstalling, I got `Error: llama runner process has terminated: exit status 0xc0000409 error:Could not initialize Tensile library`.

Running with `OLLAMA_LLM_LIBRARY=cpu_avx2` succeeds.
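For reference, the workaround above amounts to setting the `OLLAMA_LLM_LIBRARY` environment variable before launching the model, so Ollama picks the CPU AVX2 runner instead of the ROCm GPU runner. A minimal sketch (POSIX shell, with the Windows PowerShell form in a comment; variable name and value taken from the comment above):

```shell
# Force Ollama's CPU AVX2 runner instead of the ROCm GPU runner.
# PowerShell equivalent:  $env:OLLAMA_LLM_LIBRARY = "cpu_avx2"
export OLLAMA_LLM_LIBRARY=cpu_avx2

# Then launch the model as in the issue:
#   ollama run glm4
echo "$OLLAMA_LLM_LIBRARY"
```

Note this sidesteps the GPU entirely, so it is a diagnostic workaround rather than a fix for the Tensile initialization failure.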


@jmorganca commented on GitHub (Jul 10, 2024):

Hi there, sorry you hit this issue. Which AMD gpu do you have? Thanks!


@qiulaidongfeng commented on GitHub (Jul 10, 2024):

Radeon 780M



@jmorganca commented on GitHub (Jul 11, 2024):

Closing for https://github.com/ollama/ollama/issues/5622


Reference: github-starred/ollama#50008