[GH-ISSUE #1454] Repeated output during use #26540

Closed
opened 2026-04-22 02:52:31 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @duyaofei on GitHub (Dec 10, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1454

Today I ran yi:34b-chat-q4_K_M with Ollama and the model got stuck repeating its output. When I entered the same prompt on the official web page, the output was normal. I suspect the problem comes from invisible control characters in the output.
Thank you for your hard work. I hope you have time to look into this problem.

GiteaMirror added the bug label 2026-04-22 02:52:31 -05:00
@PrasannaVnewtglobal commented on GitHub (Jan 3, 2024):

Try some other chat models (e.g., codellama, llama2) to see whether the problem is specific to this model.

<!-- gh-comment-id:1875285265 -->
@pdevine commented on GitHub (Mar 12, 2024):

Hi @duyaofei, sorry about the slow response.

I just pulled `yi:34b-chat-q4_K_M` (a045fcc68517), which works with the most up-to-date version of Ollama (0.1.28). You should be able to upgrade to the latest version, pull the model again, and it should work.

I'm going to go ahead and close the issue, but feel free to keep commenting or to reopen the issue if you're still having problems.
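The upgrade-and-repull advice above can be sketched as shell commands. This is a minimal sketch assuming a Linux install via Ollama's official install script; adjust for your platform and install method:

```shell
# Upgrade Ollama using the official install script (re-running it upgrades in place).
curl -fsSL https://ollama.com/install.sh | sh

# Confirm the installed version (0.1.28 or later was reported to fix the repetition).
ollama --version

# Pull the model again so the updated model files are fetched.
ollama pull yi:34b-chat-q4_K_M

# Run it interactively and check that the output no longer loops.
ollama run yi:34b-chat-q4_K_M
```

On macOS and Windows, upgrading is done by downloading the latest app from the Ollama website instead of the install script.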

<!-- gh-comment-id:1992605631 -->

Reference: github-starred/ollama#26540