[GH-ISSUE #4133] "which/max" command line options to help with sizing. #64606

Open
opened 2026-05-03 18:19:24 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @bigattichouse on GitHub (May 3, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4133

Frequently I have to experiment with the various quants available for a model. I'd like to know which ones I can actually run, instead of downloading and testing each one until I find one that works. This would save us all some bandwidth.

`ollama which somemodel` to determine which variants of a model I can run.

`ollama max somemodel` to choose the largest variant from the list that I can run.

Not sure how instruct/chat variants might interact, since those usually appear after the colon, perhaps with dashes: `ollama which somemodel-instruct`? Or some wildcard: `ollama which somemodel:*instruct*`?
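To illustrate the idea, here is a minimal sketch of the selection logic a `max`-style command might use: compare each quant's estimated memory footprint against available VRAM and pick the largest one that fits. The model tags, on-disk sizes, and the 1.2× runtime-overhead factor below are all illustrative assumptions, not values from ollama.

```python
# Hypothetical sketch of "pick the largest quant that fits".
# Sizes and the overhead factor are illustrative assumptions.

def pick_largest_fitting(tags, available_bytes, overhead=1.2):
    """Return the largest model tag whose estimated footprint
    (on-disk size * overhead for KV cache etc.) fits, or None."""
    fitting = [(size, tag) for tag, size in tags.items()
               if size * overhead <= available_bytes]
    if not fitting:
        return None
    return max(fitting)[1]

# Illustrative on-disk sizes for quants of a 7B model (bytes).
quants = {
    "somemodel:7b-q2_K": 2_800_000_000,
    "somemodel:7b-q4_0": 3_800_000_000,
    "somemodel:7b-q8_0": 7_200_000_000,
}

print(pick_largest_fitting(quants, 8 * 1024**3))  # with ~8 GiB free
```

A real implementation would also have to account for context length, partial GPU offload, and multi-GPU splits, which is presumably why this is non-trivial to expose as a simple CLI flag.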

GiteaMirror added the feature request label 2026-05-03 18:19:24 -05:00

Reference: github-starred/ollama#64606