---
pagetitle: "TinyTorch — Don't import torch. Build it."
---

```{=html}

Don't import it. Build it.

Build your own ML framework — from tensors to systems.

Preview · Classroom-ready Fall 2026

An educational framework for building and optimizing ML — understand how PyTorch, TensorFlow, and JAX really work.

TinyTorch is usable today for self-paced learning and active course pilots. APIs, instructor packaging, and classroom workflows will continue to stabilize through the Fall 2026 classroom release.

Start Building →
... stars on GitHub — add yours and support free ML education
🔧 Build each piece — Tensors, autograd, attention. No magic imports.
📚 Recreate history — Perceptron → CNN → Transformers → MLPerf.
⚡ Understand systems — Memory, compute, optimization trade-offs.
🎯 Debug anything — OOM, NaN, slow training—because you built it.
```

## Recreate ML History

Walk through ML history by rebuilding its greatest breakthroughs with YOUR TinyTorch implementations. Click each milestone to see what you'll build and how it shaped modern AI.

```{=html}
1958
The Perceptron
The first trainable neural network
Input → Linear → Sigmoid → Output
1969
XOR Crisis
Minsky & Papert expose limits of single-layer networks
Input → Linear → Sigmoid → FAIL!
1986
MLP Revival
Backpropagation enables deep learning (95%+ MNIST)
Images → Flatten → Linear → ... → Classes
1998
CNN Revolution 🎯
Spatial intelligence unlocks computer vision (75%+ CIFAR-10)
Images → Conv → Pool → ... → Classes
2017
Transformer Era
Attention launches the LLM revolution
Tokens → Attention → FFN → Output
2018–Present
MLPerf Benchmarks
Production optimization (8-16× smaller, 12-40× faster)
Profile → Compress → Accelerate
```

## Why Build Instead of Use?

```{=html}

"Building systems creates irreversible understanding."

```

:::: {.comparison-grid}
::: {.comparison-bad}
[Traditional ML Education]{.comparison-title}

```python
import torch

model = torch.nn.Linear(784, 10)
output = model(input)  # When this breaks, you're stuck
```

**Problem**: You can't debug what you don't understand.
:::
::: {.comparison-good}
[TinyTorch: Build → Use → Reflect]{.comparison-title}

```python
# BUILD it yourself
class Linear:
    def forward(self, x):
        return x @ self.weight + self.bias

# USE it on real data
loss.backward()  # YOUR autograd
```

**Advantage**: You can debug it because you built it.
:::
::::

## Learning Path

Four progressive tiers take you from foundations to production systems:

:::: {.tier-grid}
::: {.tier-card .tier-foundation}
[**Foundation (01-08)**
Tensors, autograd, layers, training loops](tiers/foundation.qmd)
:::
::: {.tier-card .tier-architecture}
[**Architecture (09-13)**
CNNs, attention, transformers, GPT](tiers/architecture.qmd)
:::
::: {.tier-card .tier-optimization}
[**Optimization (14-19)**
Profiling, quantization, acceleration](tiers/optimization.qmd)
:::
::: {.tier-card .tier-olympics}
[**Torch Olympics (20)**
Competition-ready capstone project](tiers/olympics.qmd)
:::
::::

[**The Big Picture**](big-picture.qmd) | [**Getting Started**](getting-started.qmd) | [**Welcome**](preface.qmd)

## Is This For You?

:::: {.audience-grid}
::: {.audience-card}
[🎓 **Students**]{.audience-title}
[Taking ML courses, want to understand what's behind `import torch`]{.audience-desc}
:::
::: {.audience-card}
[👩‍🏫 **Instructors**]{.audience-title}
[Teaching ML systems with ready-made hands-on labs]{.audience-desc}
:::
::: {.audience-card}
[🚀 **Self-learners**]{.audience-title}
[Career changers or hobbyists going deeper than tutorials]{.audience-desc}
:::
::::

**Prerequisites**: Python + basic linear algebra. No ML experience required.

## Join the Community

```{=html}

See learners building ML systems worldwide

Add yourself to the map · Share your progress · Connect with builders

Part of the MLSysBook project — every ⭐ helps support free ML education

🌍 Join the Community ⭐ Star on GitHub ... 💬 Discuss on GitHub 📬 Get Updates
```

**Next Steps**: [**Quick Start**](getting-started.qmd) (15 min) | [**The Big Picture**](big-picture.qmd) | [**Community**](community.qmd)
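Curious what "build it yourself" feels like? Here is a minimal sketch of the kind of layer you'll implement in the Foundation tier — plain NumPy, with hand-derived gradients in place of autograd. The class and attribute names are illustrative, not TinyTorch's actual API.

```python
import numpy as np

class Linear:
    """A minimal fully connected layer: y = x @ W + b."""

    def __init__(self, in_features, out_features):
        # Small random weights; real frameworks use smarter init schemes.
        self.weight = np.random.randn(in_features, out_features) * 0.01
        self.bias = np.zeros(out_features)

    def forward(self, x):
        self.x = x  # cache the input for the backward pass
        return x @ self.weight + self.bias

    def backward(self, grad_out):
        # Gradients follow directly from the chain rule.
        self.grad_weight = self.x.T @ grad_out
        self.grad_bias = grad_out.sum(axis=0)
        return grad_out @ self.weight.T  # gradient w.r.t. the input

layer = Linear(784, 10)
out = layer.forward(np.random.randn(32, 784))
print(out.shape)  # (32, 10)
```

In the course you replace the hand-written `backward` with a general autograd engine, so `loss.backward()` works for any composition of layers.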