[GH-ISSUE #7356] Console show formation backslash chars #66732

Closed
opened 2026-05-04 07:59:30 -05:00 by GiteaMirror · 10 comments
Owner

Originally created by @lsalamon on GitHub (Oct 25, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7356

What is the issue?

Why does any question produce text that is not formatted at the console, like this:
A number \( a \) has a multiplicative inverse modulo \( n \) if there exists an integer \( b \) such that:
\[ (a \cdot b) \equiv 1 \pmod{n} \]

Ollama Win64 0.3.14.0.
The older version 0.3.10.0 has the same issue.

I noticed that sometimes, in daily use, the problem doesn't always appear.

OS

Windows

GPU

Other

CPU

AMD

Ollama version

0.3.14.0

GiteaMirror added the bug label 2026-05-04 07:59:30 -05:00

@rick-github commented on GitHub (Oct 25, 2024):

This is [LaTeX](https://en.wikipedia.org/wiki/LaTeX) markup. If you put this in a [LaTeX renderer](https://arachnoid.com/latex/?equ=%5B%20(a%20%5Ccdot%20b)%20%5Cequiv%201%20%5Cpmod%7Bn%7D%20%5D), you get a nicely formatted equation.


@lsalamon commented on GitHub (Oct 25, 2024):

> This is Latex markup. If you put this in a latex renderer, you get a nicely formatted equation.

Your tip is useful, but I need the text output in the console to be without formatting characters.
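If you want to post-process the console output yourself rather than change the model's behavior, a small script can strip the math delimiters from a reply. A minimal sketch (the function name and the regexes are illustrative, not part of Ollama):

```python
import re

def strip_latex_delimiters(text: str) -> str:
    """Remove LaTeX math delimiters \\( \\) and \\[ \\] from model output,
    keeping the expression inside as plain text."""
    # Drop inline-math delimiters: \( ... \) -> ...
    text = re.sub(r"\\\(\s*|\s*\\\)", "", text)
    # Drop display-math delimiters: \[ ... \] -> ...
    text = re.sub(r"\\\[\s*|\s*\\\]", "", text)
    return text

print(strip_latex_delimiters(r"A number \( a \) has an inverse modulo \( n \)."))
# -> A number a has an inverse modulo n.
```

Note this only removes the delimiters; commands inside the math (such as `\cdot` or `\pmod`) would need additional substitutions.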


@rick-github commented on GitHub (Oct 25, 2024):

This is how the model was trained. You can try telling it not to use LaTeX, either in the prompt you give it or in the system message. Which model are you using?
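For example, in the interactive console you could try a system message along these lines (a sketch; whether the model actually complies depends on the model):

```
/set system Answer in plain text only. Do not use LaTeX or other math markup.
```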


@lsalamon commented on GitHub (Oct 25, 2024):

> This is how the model was trained. You can try telling it to not use latex in the prompt that you give it, or in the system message. Which model are you using?

Today is qwen2-math:72b.
Just yesterday I got the answers formatted correctly!


@rick-github commented on GitHub (Oct 25, 2024):

Models answer on a probabilistic basis; unless you set the `seed` and `temperature` parameters, you will always get variations in the answers.


@lsalamon commented on GitHub (Oct 25, 2024):

> Models answer on a probabilistic basis, unless you set the `seed` and `temperature` you will always get variations in the answers.

OK. How do I set this in my case?


@rick-github commented on GitHub (Oct 25, 2024):

```
/set parameter seed 1
/set parameter temperature 0
```

This will reduce the variability in the responses, but it won't make the model stop using LaTeX. If the model was trained on data in LaTeX format, that's how it's going to respond.
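If you load the model via a Modelfile rather than setting parameters interactively, the same options can be baked in. A sketch, assuming the qwen2-math:72b model mentioned above:

```
FROM qwen2-math:72b
PARAMETER seed 1
PARAMETER temperature 0
```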


@lsalamon commented on GitHub (Oct 25, 2024):

> /set parameter seed 1
> /set parameter temperature 0
>
> This will reduce the variability in the responses but it won't make the model stop using latex. If the model was trained with data in latex format, that's how it's going to respond.

What could cause responses to not always contain LaTeX formatting characters?


@rick-github commented on GitHub (Oct 25, 2024):

Models answer on a probabilistic basis. Some of the training data was in LaTeX format, some wasn't. Sometimes the answer will be based on the LaTeX data, sometimes not.


@lsalamon commented on GitHub (Oct 26, 2024):

> > /set parameter seed 1
> > /set parameter temperature 0
> >
> > This will reduce the variability in the responses but it won't make the model stop using latex. If the model was trained with data in latex format, that's how it's going to respond.
>
> What could cause responses to not always contain Latex formatting characters?

Setting these parameters stabilized the responses; they no longer contain the markers.


Reference: github-starred/ollama#66732