[PR #6762] [MERGED] refactor show ouput #12219

Closed
opened 2026-04-12 23:52:16 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/ollama/ollama/pull/6762
Author: @mxyng
Created: 9/11/2024
Status: Merged
Merged: 9/11/2024
Merged by: @mxyng

Base: main ← Head: mxyng/show-output


📝 Commits (1)

- `ecab6f1` refactor show ouput — https://github.com/ollama/ollama/commit/ecab6f1cc582a5ce8ee2bfbc780cb9990115a3da

📊 Changes

3 files changed (+277 additions, -105 deletions)


📝 cmd/cmd.go (+70 -104)
➕ cmd/cmd_test.go (+206 -0)
📝 cmd/interactive.go (+1 -1)

📄 Description

Fixes line wrapping on long texts. The previous code made multiple passes through a table writer, breaking the intermediate output back into lines to feed into another table writer. Since some fields are much longer than others, the column widths became inflated, leaving everything padded with whitespace.

This change fixes the root issue by using an individual table writer for each section, which lets each section be rendered independently: long text in one section no longer affects the column widths of unrelated tables. The output itself is largely unchanged:

$ ollama show maybe
  Model
        parameters              8.0B
        quantization            Q4_0
        architecture            llama
        context length          131072
        embedding length        4096

  Parameters
        stop    "<|start_header_id|>"
        stop    "<|end_header_id|>"
        stop    "<|eot_id|>"

  System
        You are a world-class AI system, capable of complex reasoning and reflection. Reason through the
        query inside <thinking> tags, and then provide your final response inside <output> tags. If you
        detect that you made a mistake in your reasoning at any point, correct yourself inside <reflection>
        tags.

  License
        LLAMA 3.1 COMMUNITY LICENSE AGREEMENT
        Llama 3.1 Version Release Date: July 23, 2024

resolves #6740
resolves #6763


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-04-12 23:52:16 -05:00

Reference: github-starred/ollama#12219