diff --git a/tinytorch/milestones/ABOUT.md b/tinytorch/milestones/ABOUT.md
index e63a795a9..88dc7223a 100644
--- a/tinytorch/milestones/ABOUT.md
+++ b/tinytorch/milestones/ABOUT.md
@@ -14,14 +14,14 @@ This isn't a demo - it's proof that you understand ML systems engineering from t
 
 ## The Journey
 
-| Year | Milestone | What You'll Build | Unlocked After |
+| Year | Milestone | What You'll Build | Required Modules |
 |------|-----------|-------------------|----------------|
-| **1958** | Perceptron | Binary classification with gradient descent | Module 04 |
-| **1969** | XOR Crisis | Hidden layers solve non-linear problems | Module 08 |
-| **1986** | MLP Revival | Multi-class vision (95%+ MNIST) | Module 08 |
-| **1998** | CNN Revolution | Convolutions (70%+ CIFAR-10) | Module 09 |
-| **2017** | Transformers | Language generation with attention | Module 13 |
-| **2018** | MLPerf | Production optimization pipeline | Module 19 |
+| **1958** | Perceptron | First neural network (forward pass) | 01-03 |
+| **1969** | XOR Crisis | Experience the AI Winter trigger | 01-03 |
+| **1986** | MLP Revival | Backprop solves XOR + digit recognition | 01-08 |
+| **1998** | CNN Revolution | Convolutions (70%+ CIFAR-10) | 01-09 |
+| **2017** | Transformers | Language generation with attention | 01-08, 11-13 |
+| **2018** | MLPerf | Production optimization pipeline | 01-08, 14-19 |
 
 ## Why Milestones Transform Learning
 
diff --git a/tinytorch/milestones/README.md b/tinytorch/milestones/README.md
index 8f4f8bd34..e3469948f 100644
--- a/tinytorch/milestones/README.md
+++ b/tinytorch/milestones/README.md
@@ -14,12 +14,12 @@ After completing a set of modules, you unlock the ability to run a milestone. Ea
 
 | ID | Name | Year | Required Modules | What You'll Do |
 |----|------|------|------------------|----------------|
-| 01 | Perceptron | 1957 | Part 1: 01-04, Part 2: 01-08 | Build Rosenblatt's first neural network |
-| 02 | XOR Crisis | 1969 | Part 1: 01-03, Part 2: 01-08 | Experience and solve the XOR impossibility |
-| 03 | MLP Revival | 1986 | 01-08 | Train MLPs on TinyDigits with backpropagation |
+| 01 | Perceptron | 1958 | 01-03 | Build Rosenblatt's first neural network (forward pass) |
+| 02 | XOR Crisis | 1969 | 01-03 | Experience the XOR limitation that triggered AI Winter |
+| 03 | MLP Revival | 1986 | 01-08 | Train MLPs to solve XOR + recognize digits |
 | 04 | CNN Revolution | 1998 | 01-09 | Build LeNet for image recognition |
-| 05 | Transformer Era | 2017 | 01-13 | Build attention and generate text |
-| 06 | MLPerf Benchmarks | 2018 | 01-19 | Optimize and benchmark your neural networks |
+| 05 | Transformer Era | 2017 | 01-08, 11-13 | Build attention and generate text |
+| 06 | MLPerf Benchmarks | 2018 | 01-08, 14-19 | Optimize and benchmark your neural networks |
 
 ## Running Milestones
 
diff --git a/tinytorch/site/big-picture.md b/tinytorch/site/big-picture.md
index ce9c44d44..661151650 100644
--- a/tinytorch/site/big-picture.md
+++ b/tinytorch/site/big-picture.md
@@ -125,11 +125,11 @@ Concrete outcomes at each major checkpoint:
 
 | After Module | You'll Have Built | Historical Context |
 |--------------|-------------------|-------------------|
-| **01-04** | Working Perceptron classifier | Rosenblatt 1958 |
+| **01-03** | Working Perceptron classifier (forward pass) | Rosenblatt 1958 |
 | **01-08** | MLP solving XOR + complete training pipeline | AI Winter breakthrough 1969→1986 |
 | **01-09** | CNN with convolutions and pooling | LeNet-5 (1998) |
-| **01-13** | GPT model with autoregressive generation | "Attention Is All You Need" (2017) |
-| **01-19** | Optimized, quantized, accelerated system | Production ML today |
+| **01-08 + 11-13** | GPT model with autoregressive generation | "Attention Is All You Need" (2017) |
+| **01-08 + 14-19** | Optimized, quantized, accelerated system | Production ML today |
 | **01-20** | MLPerf-style benchmarking submission | Torch Olympics |
 
 ```{admonition} The North Star Build
diff --git a/tinytorch/site/getting-started.md b/tinytorch/site/getting-started.md
index f46bda69b..d5a4b73c3 100644
--- a/tinytorch/site/getting-started.md
+++ b/tinytorch/site/getting-started.md
@@ -157,13 +157,14 @@ You're recreating ML history with your own code. *By Module 19, you'll benchmark
 
 As you complete more modules, you unlock more milestones:
 
-| Modules Completed | Milestone Unlocked | What You Recreate |
-|-------------------|-------------------|-------------------|
-| 01-04 | `perceptron` | The 1958 Perceptron |
-| 01-08 | `backprop` | 1986 Backpropagation |
-| 01-09 | `lenet` | 1998 LeNet CNN |
-| 01-13 | `transformer` | 2017 Transformer |
-| 01-19 | `mlperf` | MLPerf Benchmarks |
+| Modules Completed | Milestone | What You Recreate |
+|-------------------|-----------|-------------------|
+| 01-03 | Perceptron (1958) | First neural network (forward pass) |
+| 01-03 | XOR Crisis (1969) | The limitation that triggered AI Winter |
+| 01-08 | MLP Revival (1986) | Backprop solves XOR + real digit recognition |
+| 01-09 | CNN Revolution (1998) | Convolutions for spatial understanding |
+| 01-08 + 11-13 | Transformers (2017) | Language generation with attention |
+| 01-08 + 14-19 | MLPerf (2018) | Production optimization pipeline |
 
 See all milestones and their requirements:
 
diff --git a/tinytorch/site/tito/milestones.md b/tinytorch/site/tito/milestones.md
index 00ac606e6..0506356d6 100644
--- a/tinytorch/site/tito/milestones.md
+++ b/tinytorch/site/tito/milestones.md
@@ -145,7 +145,7 @@ See your journey through ML history in a visual tree format.
 
 **What**: Frank Rosenblatt's first trainable neural network
 
-**Requires**: Module 01 (Tensor)
+**Requires**: Modules 01-03 (Tensor, Activations, Layers)
 
 **What you'll do**: Implement and train the perceptron that proved machines could learn
 
@@ -160,13 +160,13 @@ tito milestone run 01
 
 ### Milestone 02: XOR Crisis (1969)
 
-**What**: Solving the problem that stalled AI research
+**What**: Demonstrating the problem that stalled AI research
 
-**Requires**: Modules 01-02 (Tensor, Activations)
+**Requires**: Modules 01-03 (Tensor, Activations, Layers)
 
-**What you'll do**: Use multi-layer networks to solve XOR - impossible for single-layer perceptrons
+**What you'll do**: Experience how single-layer perceptrons fail on XOR - the limitation that triggered the "AI Winter"
 
-**Historical significance**: Minsky & Papert showed perceptron limitations; this shows how to overcome them
+**Historical significance**: Minsky & Papert showed perceptron limitations; this milestone demonstrates the crisis before the solution
 
 **Run it**:
 ```bash
@@ -213,7 +213,7 @@ tito milestone run 04
 
 **What**: "Attention is All You Need"
 
-**Requires**: Modules 01-13 (Foundation + Architecture Tiers)
+**Requires**: Modules 01-08 + 11-13 (Foundation + Embeddings, Attention, Transformers)
 
 **What you'll do**: Implement transformer architecture with self-attention mechanism
 
@@ -230,7 +230,7 @@ tito milestone run 05
 
 **What**: Production ML Systems
 
-**Requires**: Modules 01-19 (Foundation + Architecture + Optimization Tiers)
+**Requires**: Modules 01-08 + 14-19 (Foundation + Optimization Tier)
 
 **What you'll do**: Optimize for production deployment with quantization, compression, and benchmarking
 
@@ -426,7 +426,7 @@ tito module complete XX
 
 **Solution**:
 1. Check error message for which module failed
-2. Edit `modules/source/XX_name/` (NOT `tinytorch/`)
+2. Edit `modules/XX_name/XX_name.ipynb` (NOT `tinytorch/`)
 3. Re-export: `tito module complete XX`
 4. Run milestone again
 
@@ -435,8 +435,8 @@ tito module complete XX
 Start with the Foundation Tier and work toward your first milestone
 
-Foundation Tier →
-Historical Context →
+Foundation Tier →
+Historical Context →