[GH-ISSUE #15889] Cloud-only binaries #72184

Open
opened 2026-05-05 03:36:20 -05:00 by GiteaMirror · 0 comments

Originally created by @ShibuyaCyana on GitHub (Apr 30, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15889

I use Ollama on a machine with limited resources and only connect to the official cloud services. Downloading all of the bundled GPU runtimes wastes bandwidth and disk space. I'd be happy to see a binary that omits the supporting components needed to run LLMs locally.

Related (but different use case):
#6531, #9504, #12268

GiteaMirror added the feature request label 2026-05-05 03:36:20 -05:00

Reference: github-starred/ollama#72184