[PR #10717] cmd: add ellipses to truncated show metadata #13339

Closed
opened 2026-04-13 00:24:16 -05:00 by GiteaMirror · 0 comments
Owner

Original Pull Request: https://github.com/ollama/ollama/pull/10717

State: closed
Merged: Yes


When a piece of information has been truncated in the show output, an ellipsis is now appended to indicate that more data exists but has not been displayed.

Not communicating that the data had been truncated caused some confusion when troubleshooting an issue with model metadata.

Sample output:

```
❯ ollama show llama3.2-vision -v
# omitted unchanged fields here - Bruce
  Metadata
    mllama.vision.intermediate_layers_indices     [3 7 15 ...+2 more]
    mllama.vision.max_num_tiles                   4
    mllama.vision.num_channels                    3
    mllama.vision.patch_size                      14
    mllama.vocab_size                             128256
    tokenizer.ggml.add_bos_token                  false
    tokenizer.ggml.add_eos_token                  false
    tokenizer.ggml.add_padding_token              false
    tokenizer.ggml.bos_token_id                   128000
    tokenizer.ggml.eos_token_id                   128009
    tokenizer.ggml.merges                         [Ġ Ġ ...+280146 more]
    tokenizer.ggml.model                          gpt2
    tokenizer.ggml.padding_token_id               128004
    tokenizer.ggml.pre                            llama-bpe
    tokenizer.ggml.scores                         [0 1 2 ...+128254 more]
    tokenizer.ggml.token_type                     [1 1 1 ...+128254 more]
    tokenizer.ggml.tokens                         [! " # ...+128254 more]
```
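The `...+N more` rendering above can be sketched roughly as follows. This is a hypothetical illustration, not the actual code from the PR; the function name `truncateList` and the display limit of 3 are assumptions for the example.

```go
package main

import (
	"fmt"
	"strings"
)

// truncateList renders a long metadata array by showing the first `limit`
// elements followed by an ellipsis and a count of the omitted elements,
// so the reader can tell the value was truncated.
func truncateList(items []string, limit int) string {
	if len(items) <= limit {
		return "[" + strings.Join(items, " ") + "]"
	}
	shown := strings.Join(items[:limit], " ")
	return fmt.Sprintf("[%s ...+%d more]", shown, len(items)-limit)
}

func main() {
	// Mirrors the intermediate_layers_indices line from the sample output.
	fmt.Println(truncateList([]string{"3", "7", "15", "23", "30"}, 3))
	// Short arrays are printed in full, with no ellipsis.
	fmt.Println(truncateList([]string{"4"}, 3))
}
```

With a 5-element slice and a limit of 3, this prints `[3 7 15 ...+2 more]`, matching the format in the sample output.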
GiteaMirror added the pull-request label 2026-04-13 00:24:16 -05:00

Reference: github-starred/ollama#13339