[GH-ISSUE #14239] Too many blank lines between numbered bullet in the answer. #35032

Open
opened 2026-04-22 19:08:58 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @bulrush15 on GitHub (Feb 13, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14239

What is the issue?

OS: Windows 11 25h2 build 26200.7840
Video card: NVidia GeForce RTX 3060, 12GB VRAM
Ollama: v0.16.0
Model: Gemma3
Prompt method: Via Ollama GUI, not command line.

When I ask a question, the regular bullet points in the AI's reply have one blank line between them, which is fine. When the AI replies with numbered bullets, however, there are two blank lines between each numbered item. This takes up too much vertical space.

Screenshot below.

[Image](https://github.com/user-attachments/assets/b9572cc7-1cdd-46b7-8665-5392184d5519)

Can this be fixed please?

Relevant log output


OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.16.0

GiteaMirror added the app, bug labels 2026-04-22 19:08:58 -05:00
Author
Owner

@rick-github commented on GitHub (Feb 14, 2026):

Additional information: the issue is with the rendering in the ollama GUI. For example, open the ollama GUI and prompt: "Create a bulleted point list of three reasons why the sky is blue". The ollama GUI will render the list with extra space between the list entries. Do it with another GUI, eg OpenWebUI, and it doesn't.
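One way to check whether the extra gap comes from the model's raw output or from the GUI's Markdown renderer is to inspect the unrendered response text (for example, the `response` field returned by Ollama's `POST /api/generate` with `"stream": false`). A minimal sketch of such a check, assuming the response is already available as a string; the helper name is mine:

```python
import re

def blank_runs_between_numbered_items(markdown_text: str) -> list[int]:
    """Return the number of blank lines separating each pair of
    consecutive numbered list items in a Markdown string."""
    item_re = re.compile(r"^\s*\d+[.)]\s")  # matches "1. foo" or "2) bar"
    runs: list[int] = []
    blanks, seen_item = 0, False
    for line in markdown_text.splitlines():
        if item_re.match(line):
            if seen_item:
                runs.append(blanks)
            seen_item, blanks = True, 0
        elif line.strip() == "":
            blanks += 1
        else:
            blanks = 0  # non-blank, non-item text resets the count
    return runs

# Two blank lines between items, as in the screenshot:
sample = "1. first\n\n\n2. second\n\n\n3. third\n"
print(blank_runs_between_numbered_items(sample))  # [2, 2]
```

If the raw text shows `[0, 0]` or `[1, 1]` but the GUI still displays double gaps, the problem is in the renderer rather than in the model output.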

Author
Owner

@ryanmon1 commented on GitHub (Feb 15, 2026):

> Additional information: the issue is with the rendering in the ollama GUI. For example, open the ollama GUI and prompt: "Create a bulleted point list of three reasons why the sky is blue". The ollama GUI will render the list with extra space between the list entries. Do it with another GUI, eg OpenWebUI, and it doesn't.

This is not entirely true. I am having the same issue using Ollama to generate synthetic datasets with Easy Dataset and Kiln AI.
I'm getting excessive spacing in the generated output: of the 12,000 Q&A pairs made yesterday in Easy Dataset, every single response from the Ollama models has this spacing problem.
The issue also affects defined schemas and results in errors more often than not.
The last dataset I made using Ollama was about 3-4 months ago, and this spacing problem didn't exist at that time. I have personally observed the issue across multiple models, but only in Ollama.
I tested the same models in LM Studio, Msty, Jan, llama.cpp, and vLLM, and there are no spacing issues at all.

This might not be a huge issue for the majority of people, but for what I am doing, Ollama cannot be used anymore.
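For pipeline use cases like the one described above, a workaround (not a fix) is to post-process the model output before it reaches downstream tooling, collapsing extra blank lines between numbered list items. A rough sketch; the regex and function name are mine, and it only handles the simple numbered-list case:

```python
import re

# Collapse runs of 2+ blank lines down to a single blank line, but only
# between consecutive numbered list items, leaving other text untouched.
_EXTRA_GAP = re.compile(
    r"(^[ \t]*\d+[.)][^\n]*\n)"   # a numbered item line, e.g. "1. alpha"
    r"(?:[ \t]*\n){2,}"           # two or more blank lines
    r"(?=[ \t]*\d+[.)]\s)",       # followed by another numbered item
    re.M,
)

def tighten_numbered_lists(text: str) -> str:
    """Replace multi-blank-line gaps between numbered items with one blank line."""
    return _EXTRA_GAP.sub(r"\1\n", text)

loose = "Intro\n\n1. alpha\n\n\n2. beta\n\n\n3. gamma\n\nOutro\n"
print(tighten_numbered_lists(loose))
```

This keeps a single blank line between items (so ordinary rendering is unchanged) and leaves surrounding paragraphs alone; it does not address why the extra lines are generated in the first place.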

Author
Owner

@rick-github commented on GitHub (Feb 15, 2026):

Different problem. Open an issue, provide logs and example code.

Reference: github-starred/ollama#35032