[GH-ISSUE #9476] Chinese output with llama3.1, llama3.2 and llama3.3 broken #31935

Closed
opened 2026-04-22 12:45:38 -05:00 by GiteaMirror · 1 comment

Originally created by @p3d-dev on GitHub (Mar 3, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9476

What is the issue?

Using Ollama with llama3 models on macOS, one of the recent upgrades (perhaps to 0.5.11 or 0.5.12) has broken output of Chinese characters. Approximately one month ago, llama3.3 could chat flawlessly in Chinese characters. Now it no longer outputs any Chinese characters.

Relevant log output

```shell
Mac Shell % ollama run llama3.3:70b
>>> please translate to chinese: "Hello"
(nǐ hǎo)

(Note: (nǐ hǎo) is a formal way of saying "hello" in Mandarin Chinese. For informal settings, you can use (nĭ hăo) or simply (hāi))
```

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.5.11 and 0.5.12

GiteaMirror added the bug label 2026-04-22 12:45:38 -05:00

@BruceMacD commented on GitHub (Mar 4, 2025):

Hi @p3d-dev, I went back as far as `v0.5.4` and see the same output as you. I'd suggest anchoring the model by posing the question in Mandarin to get the results you're after:

```bash
❯ ollama run llama3.3:70b
>>> 请翻译成中文: "hello"
你好!
```
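The same anchoring idea can be applied programmatically: a system message that instructs the model to answer in Chinese tends to steer its output language. Below is a minimal sketch against Ollama's local REST API (`/api/chat` on port 11434, which is Ollama's documented default); the model tag and prompt strings are illustrative and assume the model has already been pulled.

```python
# Sketch: anchoring llama3.x toward Chinese output with a system message,
# sent to a locally running Ollama server via its /api/chat endpoint.
import json
import urllib.request


def build_chat_payload(model: str, system: str, user: str) -> dict:
    """Build a non-streaming /api/chat request body with a system-message anchor."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
        "stream": False,
    }


def chat(payload: dict, host: str = "http://localhost:11434") -> str:
    """POST the payload to the Ollama server and return the reply text."""
    req = urllib.request.Request(
        host + "/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]


payload = build_chat_payload(
    "llama3.3:70b",
    "请始终用中文回答。",  # "Always answer in Chinese."
    'please translate to chinese: "Hello"',
)
# reply = chat(payload)  # requires a running Ollama server with the model pulled
```

An equivalent persistent fix is to bake the same instruction into a Modelfile `SYSTEM` line so every interactive `ollama run` session is anchored without repeating the prompt.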

Reference: github-starred/ollama#31935