[GH-ISSUE #5281] update /show to work like command line show #49821

Closed
opened 2026-04-28 13:04:14 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @iplayfast on GitHub (Jun 25, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5281

Originally assigned to: @royjhan on GitHub.

I really like the new

ollama show <model>

feature.
When running ollama from the command line or via a URL, it would be nice to be able to get the same type of info without actually loading the model and requesting all the individual sections.
Currently:

>>> /show
Available Commands:
  /show info         Show details for this model
  /show license      Show model license
  /show modelfile    Show Modelfile for this model
  /show parameters   Show parameters for this model
  /show system       Show system message
  /show template     Show prompt template

I'm thinking something like

/show model <model>

which, for example, would show the same output as the command line:

ollama show llama3
  Model
  	arch            	llama
  	parameters      	8.0B
  	quantization    	Q4_0
  	context length  	8192
  	embedding length	4096

  Parameters
  	num_keep	24
  	stop    	"<|start_header_id|>"
  	stop    	"<|end_header_id|>"
  	stop    	"<|eot_id|>"

  License
  	META LLAMA 3 COMMUNITY LICENSE AGREEMENT
  	Meta Llama 3 Version Release Date: April 18, 2024

OR

/show model_json <model>

which would show the same thing in JSON format.
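To make the proposal concrete, here is a hypothetical JSON shape for the /show model_json output, mirroring the sections of the plain-text output above. The field names are my illustration, not an existing Ollama schema:

```python
import json

# Hypothetical payload for a "/show model_json llama3" command.
# Section/field names mirror the plain-text "ollama show" output above;
# they are an illustration, not an actual Ollama format.
model_info = {
    "model": {
        "arch": "llama",
        "parameters": "8.0B",
        "quantization": "Q4_0",
        "context_length": 8192,
        "embedding_length": 4096,
    },
    "parameters": {
        "num_keep": 24,
        "stop": ["<|start_header_id|>", "<|end_header_id|>", "<|eot_id|>"],
    },
    "license": "META LLAMA 3 COMMUNITY LICENSE AGREEMENT",
}

print(json.dumps(model_info, indent=2))
```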

GiteaMirror added the feature request label 2026-04-28 13:04:14 -05:00
Author
Owner

@iplayfast commented on GitHub (Jun 25, 2024):

I've been looking through the source code and I realised there are a number of improvements that could be made in this area.

  1. /show gives family info (i.e. bert, llama, clip); ollama show should also give this info.
  2. In the source code there is a function IsEmbedding() which filters generate and chat. It would be nice to surface this in the show commands (both /show model and ollama show model).
  3. In the same vein as IsEmbedding(), which just searches for the keywords "bert" and "nomic-bert" to return true/false, an IsImageDetection() could be created which checks for the "clip" family to indicate that the model can detect images. This could then be part of the /show operation.
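A minimal sketch of points 2 and 3, assuming IsImageDetection() follows the same keyword-matching pattern the comment describes for IsEmbedding(). Ollama's actual implementation is Go; this Python version only illustrates the pattern, and the family lists and function names are assumptions:

```python
# Sketch of the keyword-based capability checks described above.
# The family lists are illustrative; Ollama's real IsEmbedding() is
# Go code that matches "bert" / "nomic-bert" per the comment.

EMBEDDING_FAMILIES = {"bert", "nomic-bert"}
IMAGE_FAMILIES = {"clip"}

def is_embedding(families):
    """True if any model family indicates an embedding-only model."""
    return any(f in EMBEDDING_FAMILIES for f in families)

def is_image_detection(families):
    """Proposed IsImageDetection(): True if a family (e.g. "clip")
    indicates the model can process images."""
    return any(f in IMAGE_FAMILIES for f in families)

# e.g. llava reports both "llama" and "clip" families
print(is_image_detection(["llama", "clip"]))  # True
print(is_embedding(["llama", "clip"]))        # False
```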
Author
Owner

@royjhan commented on GitHub (Jun 27, 2024):

Hey! Re 1 ("/show gives family info (i.e. bert, llama, clip); ollama show should also give this info"): if you run ollama show llava you should see both the llama and clip architectures, like so:

[Screenshot: output of ollama show llava listing both the llama and clip architectures]

In general, the architectures cover all the families. You can try for embedding models too.

For consistency with the API info, I'm going to hold off on 2 and 3 for now, but feel free to create a separate issue!

Reference: github-starred/ollama#49821