[GH-ISSUE #13587] backspace x line wrapping bug #71004

Open
opened 2026-05-04 23:42:45 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @SeeTwoo on GitHub (Dec 30, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13587

What is the issue?

I have issues with line (prompt) editing in the terminal. When I backspace over a line wrap, one character is left undeleted on the screen but it does not seem to affect the buffer state.

it is for example possible to write a long prompt that triggers line wrapping, delete it all by using backspace and type /bye and it quits as it should even though a character is still visible.

I also noticed issues of the same class when resizing the terminal where visual representation and buffer state seamingly go out of sync.

I am using ubuntu 22.04.1 with the gnome terminal and ollama 0.13.5.

I’ve implemented line-editing logic in a shell before and ran into a similar issue; I’m happy to investigate or submit a PR if you think that would help. If I were to look into this, could you point me to the part of the codebase responsible for line editing / terminal input handling ?

Relevant log output

~ ollama --version
ollama version is 0.13.5
 ~ ollama run llama3.1:8b
>>> this is a really long prompt that will trigger line wrapping soon and he
... re we are


*after hitting bacspace a few times

 ~ ollama run llama3.1:8b
>>> this is a really long prompt that will trigger line wr                h

OS

Linux

GPU

Nvidia

CPU

AMD

Ollama version

0.13.5

Originally created by @SeeTwoo on GitHub (Dec 30, 2025). Original GitHub issue: https://github.com/ollama/ollama/issues/13587 ### What is the issue? I have issues with line (prompt) editing in the terminal. When I backspace over a line wrap, one character is left undeleted on the screen but it does not seem to affect the buffer state. it is for example possible to write a long prompt that triggers line wrapping, delete it all by using backspace and type /bye and it quits as it should even though a character is still visible. I also noticed issues of the same class when resizing the terminal where visual representation and buffer state seamingly go out of sync. I am using ubuntu 22.04.1 with the gnome terminal and ollama 0.13.5. I’ve implemented line-editing logic in a shell before and ran into a similar issue; I’m happy to investigate or submit a PR if you think that would help. If I were to look into this, could you point me to the part of the codebase responsible for line editing / terminal input handling ? ### Relevant log output ```shell ~ ollama --version ollama version is 0.13.5 ~ ollama run llama3.1:8b >>> this is a really long prompt that will trigger line wrapping soon and he ... re we are *after hitting bacspace a few times ~ ollama run llama3.1:8b >>> this is a really long prompt that will trigger line wr h ``` ### OS Linux ### GPU Nvidia ### CPU AMD ### Ollama version 0.13.5
GiteaMirror added the bug label 2026-05-04 23:42:45 -05:00
Author
Owner

@Savior-Bei-Fong commented on GitHub (Dec 30, 2025):

FROM CHATGPT... for the problem resolving
to fix it

This behavior is not a model loading error
It is a known terminal / TTY input bug in older Ollama versions (including 0.13.5) that occurs when:
Long prompts
Automatic line wrapping
Backspace or cursor movement
are combined.
🔍 What exactly is happening?
Symptoms
You start Ollama:
Code copyied
Bash
ollama run llama3.1:8b
You type a very long prompt
The terminal wraps the line automatically
You press Backspace or move the cursor
Result:
Input gets corrupted
Text is cut off
Cursor jumps
Ollama stops reacting correctly
Important
The model is NOT broken
There is no inference or loading error
The issue is purely terminal input handling
🧠 Technical Root Cause
Ollama 0.13.x has known issues with:
readline / terminal raw mode
Multi-line input handling
Backspace behavior on wrapped lines
Incorrect cursor position reporting by terminals
Internally:
Ollama processes input line-based
The terminal reports wrong cursor positions
Backspace deletes the wrong characters
The prompt buffer becomes inconsistent
This happens especially on:
PowerShell
CMD
Some WSL terminals
Older terminal builds
Recommended Fixes (Best → Acceptable)
1️⃣ Do NOT type long prompts interactively (BEST solution)
Use a pipe:
Code copying
Bash
echo "your very long prompt here ..." | ollama run llama3.1:8b
Or a file:
Code copying
Bash
ollama run llama3.1:8b < prompt.txt
✔ 100% stable
✔ No line wrapping
✔ No cursor issues
2️⃣ Use --prompt
Code copying
Bash
ollama run llama3.1:8b --prompt "Very long prompt here ..."
3️⃣ Intentionally add line breaks
Instead of one huge line:
Code copying

This is a long prompt.
With intentional line breaks.
This avoids wrapping bugs.
🔄 Update Recommendation (IMPORTANT)
Ollama 0.13.5 is outdated
Newer versions partially fixed these TTY bugs
Check version
Code copying
Bash

___________ this is what you need _____________

ollama version
Update
Linux / macOS
Code copying
Bash
curl -fsSL https://ollama.com/install.sh | sh
Windows Download the latest installer: https://ollama.com/download
The installer safely overwrites the old version.
🧪 Terminal Compatibility
More problematic
PowerShell (older builds)
CMD
Some WSL terminals
More stable
Windows Terminal (up to date)
Linux: bash, zsh
macOS Terminal
iTerm2
🛠 Debug & Diagnostics
Enable debug output:
Code kopieren
Bash
OLLAMA_DEBUG=1 ollama run llama3.1:8b
View logs:
Code kopieren
Bash
ollama logs
🧩 Summary
✔ No model bug
✔ No llama3.1 issue
✔ Pure TTY / input handling problem
Best Practices
Use files or pipes for long prompts
Keep Ollama updated
Use interactive mode only for short input

<!-- gh-comment-id:3700112885 --> @Savior-Bei-Fong commented on GitHub (Dec 30, 2025): FROM CHATGPT... for the problem resolving to fix it This behavior is not a model loading error It is a known terminal / TTY input bug in older Ollama versions (including 0.13.5) that occurs when: Long prompts Automatic line wrapping Backspace or cursor movement are combined. 🔍 What exactly is happening? Symptoms You start Ollama: Code copyied Bash ollama run llama3.1:8b You type a very long prompt The terminal wraps the line automatically You press Backspace or move the cursor Result: Input gets corrupted Text is cut off Cursor jumps Ollama stops reacting correctly Important ❌ The model is NOT broken ❌ There is no inference or loading error ✅ The issue is purely terminal input handling 🧠 Technical Root Cause Ollama 0.13.x has known issues with: readline / terminal raw mode Multi-line input handling Backspace behavior on wrapped lines Incorrect cursor position reporting by terminals Internally: Ollama processes input line-based The terminal reports wrong cursor positions Backspace deletes the wrong characters The prompt buffer becomes inconsistent This happens especially on: PowerShell CMD Some WSL terminals Older terminal builds ✅ Recommended Fixes (Best → Acceptable) 1️⃣ Do NOT type long prompts interactively (BEST solution) Use a pipe: Code copying Bash echo "your very long prompt here ..." | ollama run llama3.1:8b Or a file: Code copying Bash ollama run llama3.1:8b < prompt.txt ✔ 100% stable ✔ No line wrapping ✔ No cursor issues 2️⃣ Use --prompt Code copying Bash ollama run llama3.1:8b --prompt "Very long prompt here ..." 3️⃣ Intentionally add line breaks Instead of one huge line: Code copying This is a long prompt. With intentional line breaks. This avoids wrapping bugs. 🔄 Update Recommendation (IMPORTANT) Ollama 0.13.5 is outdated Newer versions partially fixed these TTY bugs Check version Code copying Bash ___________ this is what you need _____________ ollama version Update Linux / macOS Code copying Bash curl -fsSL https://ollama.com/install.sh | sh Windows Download the latest installer: https://ollama.com/download The installer safely overwrites the old version. 🧪 Terminal Compatibility ❌ More problematic PowerShell (older builds) CMD Some WSL terminals ✅ More stable Windows Terminal (up to date) Linux: bash, zsh macOS Terminal iTerm2 🛠 Debug & Diagnostics Enable debug output: Code kopieren Bash OLLAMA_DEBUG=1 ollama run llama3.1:8b View logs: Code kopieren Bash ollama logs 🧩 Summary ✔ No model bug ✔ No llama3.1 issue ✔ Pure TTY / input handling problem Best Practices Use files or pipes for long prompts Keep Ollama updated Use interactive mode only for short input
Author
Owner

@rick-github commented on GitHub (Dec 31, 2025):

@Savior-Bei-Fong Please don't post AI slop that has obvious errors.

<!-- gh-comment-id:3701342113 --> @rick-github commented on GitHub (Dec 31, 2025): @Savior-Bei-Fong Please don't post AI slop that has obvious errors.
Sign in to join this conversation.
1 Participants
Notifications
Due Date
No due date set.
Dependencies

No dependencies set.

Reference: github-starred/ollama#71004