[GH-ISSUE #5814] Always outputs GGGGGGG when encountering problems that will not occur... #3625

Closed
opened 2026-04-12 14:23:30 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @enryteam on GitHub (Jul 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5814

What is the issue?

https://ollama.com/library/glm4

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.2.7

GiteaMirror added the bug label 2026-04-12 14:23:30 -05:00
Author
Owner

@rick-github commented on GitHub (Jul 20, 2024):

dupe https://github.com/ollama/ollama/issues/5668

Author
Owner

@kindzhon commented on GitHub (Jul 23, 2024):

me too

Author
Owner

@AdamWangT commented on GitHub (Jul 25, 2024):

me too too

Author
Owner

@enryteam commented on GitHub (Jul 26, 2024):

(Translated from Chinese:) Once "GGGGGGGGGGGGGGGGGGGGGGGGGGGGGGG" appears, every question asked after that also returns "GGGGGGGGGGGGGGGGGGGGGGGGGGGGGGG".

Author
Owner

@enryteam commented on GitHub (Aug 19, 2024):

+1

Author
Owner

@jiaolongxue commented on GitHub (Aug 21, 2024):

+1

Author
Owner

@mosoai commented on GitHub (Sep 9, 2024):

Same issue here. It looks like it depends on the token size... possible?

Author
Owner

@pdevine commented on GitHub (Sep 12, 2024):

Make sure you're on the latest version of Ollama and run `ollama pull glm4` to get the latest version of the model. I just tested it with both Linux and Mac and it's working correctly. I'll go ahead and close this, but we can reopen it if people are still seeing problems.

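For anyone landing here later, the suggested fix above amounts to roughly the following commands (a sketch; exact upgrade steps depend on how Ollama was installed on your system):

```shell
# Check the installed Ollama version; the bug was reported on 0.2.7,
# so upgrade first if you are on that release or older.
ollama --version

# Re-pull the model so you get the updated weights/template,
# not the locally cached copy that produced the GGGG... output.
ollama pull glm4

# Quick smoke test: the reply should be normal text, not repeated "G"s.
ollama run glm4 "Say hello in one sentence."
```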

Reference: github-starred/ollama#3625