[GH-ISSUE #4796] Can't answer Chinese starting from 0.1.39 #28785

Closed
opened 2026-04-22 07:18:58 -05:00 by GiteaMirror · 10 comments

Originally created by @figuretom on GitHub (Jun 3, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4796

What is the issue?

Starting from 0.1.39, if you input a question in Chinese, Ollama is no longer able to answer in Chinese.

OS

No response

GPU

No response

CPU

No response

Ollama version

0.1.39+

GiteaMirror added the bug label 2026-04-22 07:18:58 -05:00

@figuretom commented on GitHub (Jun 3, 2024):

If you use the llama3, phi3, or codestral models, they will always answer your questions in English, even if you enter Chinese.


@xigua314 commented on GitHub (Jun 3, 2024):

Bro, it might be because the models are trained primarily on English data; it's not really Ollama's issue but an inherent limitation of the model itself. You could try adding a prompt like '请用中文回答:' ("Please answer in Chinese:") at the beginning, and then it may respond in Chinese. It works on llama3.
![image](https://github.com/ollama/ollama/assets/15155003/8d73895b-cc40-41a5-9cbc-26e7d6d54778)


@figuretom commented on GitHub (Jun 3, 2024):

> Bro, it might be because the models are trained primarily on English data; it's not really Ollama's issue but an inherent limitation of the model itself. You could try adding a prompt like '请用中文回答:' ("Please answer in Chinese:") at the beginning, and then it may respond in Chinese. It works on llama3.

Bro, try versions 0.1.38 and below: there is no need to add such prompts separately, because Ollama automatically answers in the language of the input. Such restrictive prompts are unfriendly for many users. What if a user wants to use English?


@jmorganca commented on GitHub (Jun 3, 2024):

Hi there, I'm not able to reproduce this:

```
% ollama run llama3
>>> 你好!
😊 你好! (nǐ hǎo) How are you? 😊
```

However, one tip for languages is to set a system prompt (here "始终用中文回答", "always answer in Chinese"):

```
% ollama run llama3
>>> /set system "始终用中文回答"
Set system message.
>>> Hi
你好!😊
```
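The same `/set system` tip can also be applied per request over Ollama's REST API (`POST /api/chat`) by prepending a message with role `"system"`. A minimal sketch in Python, assuming a `llama3` model is pulled; it only builds and prints the request body rather than sending it:

```python
import json

# Build a /api/chat request body that applies the "always answer in
# Chinese" instruction per request, instead of interactively via
# `/set system` in the REPL.
payload = {
    "model": "llama3",
    "stream": False,
    "messages": [
        # "始终用中文回答" = "Always answer in Chinese"
        {"role": "system", "content": "始终用中文回答"},
        {"role": "user", "content": "Hi"},
    ],
}

body = json.dumps(payload, ensure_ascii=False)
print(body)
# To actually send it against a local Ollama server on the default port:
#   curl http://localhost:11434/api/chat -d '<body above>'
```

Because the system message travels with each request, this works for multi-user services where no single system prompt can be baked in.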

@pdevine commented on GitHub (Jun 3, 2024):

I've found `aya` is really good at replying in different languages. Llama3, unfortunately, really does seem to prefer responding in English.


@figuretom commented on GitHub (Jun 4, 2024):

> Hi there, I'm not able to reproduce this:
>
> ```
> % ollama run llama3
> >>> 你好!
> 😊 你好! (nǐ hǎo) How are you? 😊
> ```
>
> However, one tip for languages is to set a system prompt:
>
> ```
> % ollama run llama3
> >>> /set system "始终用中文回答"
> Set system message.
> >>> Hi
> 你好!😊
> ```

Hi, Ollama is not used by just one person. In our company we build Ollama-based services for many users, such as a chatbot and an IDE plugin. We can't predict which language a user will use, and if we hard-code the system role to Chinese, it will cause more problems.


@xigua314 commented on GitHub (Jun 4, 2024):

> > Hi there, I'm not able to reproduce this:
> >
> > ```
> > % ollama run llama3
> > >>> 你好!
> > 😊 你好! (nǐ hǎo) How are you? 😊
> > ```
> >
> > However, one tip for languages is to set a system prompt:
> >
> > ```
> > % ollama run llama3
> > >>> /set system "始终用中文回答"
> > Set system message.
> > >>> Hi
> > 你好!😊
> > ```
>
> Hi, Ollama is not used by just one person. In our company we build Ollama-based services for many users, such as a chatbot and an IDE plugin. We can't predict which language a user will use, and if we hard-code the system role to Chinese, it will cause more problems.

I didn't know there was such a setting, thanks for letting me know. Also, could you start two services, one in English and one in Chinese, and switch between them after detecting the input language?
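Rather than running two services, a single frontend could pick the system prompt per request by detecting CJK characters in the user's input. A rough sketch with hypothetical helper names (`contains_cjk`, `pick_system_prompt` are illustrative, not part of Ollama):

```python
def contains_cjk(text: str) -> bool:
    """Rough language check: does the text contain any CJK Unified Ideographs?"""
    return any("\u4e00" <= ch <= "\u9fff" for ch in text)

def pick_system_prompt(user_input: str) -> str:
    """Choose a per-request system prompt based on the detected input language."""
    if contains_cjk(user_input):
        return "始终用中文回答"  # "Always answer in Chinese"
    return "Always answer in English."

print(pick_system_prompt("你好，请介绍一下自己"))
print(pick_system_prompt("Hello, introduce yourself"))
```

The chosen string would then be sent as the `"system"` message of each API request, so one service can serve both languages.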


@figuretom commented on GitHub (Jun 4, 2024):

> > > Hi there, I'm not able to reproduce this:
> > >
> > > ```
> > > % ollama run llama3
> > > >>> 你好!
> > > 😊 你好! (nǐ hǎo) How are you? 😊
> > > ```
> > >
> > > However, one tip for languages is to set a system prompt:
> > >
> > > ```
> > > % ollama run llama3
> > > >>> /set system "始终用中文回答"
> > > Set system message.
> > > >>> Hi
> > > 你好!😊
> > > ```
> >
> > Hi, Ollama is not used by just one person. In our company we build Ollama-based services for many users, such as a chatbot and an IDE plugin. We can't predict which language a user will use, and if we hard-code the system role to Chinese, it will cause more problems.
>
> I didn't know there was such a setting, thanks for letting me know. Also, could you start two services, one in English and one in Chinese, and switch between them after detecting the input language?

In 0.1.38 and earlier, these extra prompts were not needed; Ollama automatically replied in the user's language. If users are now required to choose a language, that is actually a step backwards.


@red-co commented on GitHub (Aug 25, 2024):

With `ollama run` there is no problem, but when calling the model via curl with JSON, Ollama responds only in English.


@pdevine commented on GitHub (Sep 5, 2024):

@red-co you can save your system prompt to the model, or just send it with each request. To save it to the model, create a Modelfile with:

```
FROM <model>
SYSTEM """Always respond in Chinese"""
```

Then use `ollama create` to create the model.

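For services that provision models programmatically, the Modelfile text above can be generated rather than written by hand. A small sketch; `make_modelfile` and the `llama3-zh` name are illustrative, not Ollama APIs:

```python
def make_modelfile(base_model: str, system_prompt: str) -> str:
    """Render a minimal Modelfile that bakes a system prompt into a model."""
    return f'FROM {base_model}\nSYSTEM """{system_prompt}"""\n'

text = make_modelfile("llama3", "Always respond in Chinese")
print(text)
# Save this as `Modelfile`, then build and run the derived model:
#   ollama create llama3-zh -f Modelfile
#   ollama run llama3-zh
```

The derived model then carries the system prompt with it, so every client that pulls it gets the same language behavior without per-request prompting.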
Reference: github-starred/ollama#28785