docs: fix milestone requirements to match implementation

Updates milestone documentation across all site files to match the
actual MILESTONE_SCRIPTS configuration in tito/commands/milestone.py:

- Milestone 01 (Perceptron): requires modules 01-03 (not 01-04)
- Milestone 02 (XOR Crisis): requires modules 01-03 (not 01-02)
- Milestone 05 (Transformers): requires 01-08 + 11-13 (not 01-13)
- Milestone 06 (MLPerf): requires 01-08 + 14-19 (not 01-19)

Also fixes broken link to chapters/milestones.html (directory does not
exist) and corrects path to student notebooks.
Vijay Janapa Reddi
2026-01-25 15:33:59 -05:00
parent 96fbd58d1a
commit 9e73f389a5
5 changed files with 33 additions and 32 deletions
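For context, the corrected requirements can be sketched as a simple mapping plus an unlock check. This is only an illustration of the module ranges listed in the commit message — the names `MILESTONE_REQUIREMENTS` and `is_unlocked` are assumptions, not the actual `MILESTONE_SCRIPTS` structure in `tito/commands/milestone.py`:

```python
# Hypothetical sketch of a milestone-requirements mapping; the real
# MILESTONE_SCRIPTS config in tito/commands/milestone.py may differ.
MILESTONE_REQUIREMENTS = {
    "01_perceptron": list(range(1, 4)),                           # modules 01-03
    "02_xor_crisis": list(range(1, 4)),                           # modules 01-03
    "03_mlp_revival": list(range(1, 9)),                          # modules 01-08
    "04_cnn_revolution": list(range(1, 10)),                      # modules 01-09
    "05_transformers": list(range(1, 9)) + list(range(11, 14)),   # 01-08 + 11-13
    "06_mlperf": list(range(1, 9)) + list(range(14, 20)),         # 01-08 + 14-19
}

def is_unlocked(milestone: str, completed: set[int]) -> bool:
    """A milestone unlocks once every required module is completed."""
    return set(MILESTONE_REQUIREMENTS[milestone]) <= completed
```

Note that milestone 05 does not require modules 09-10, which is why "01-13" in the old docs overstated the requirement.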


@@ -14,14 +14,14 @@ This isn't a demo - it's proof that you understand ML systems engineering from t
 ## The Journey
-| Year | Milestone | What You'll Build | Unlocked After |
+| Year | Milestone | What You'll Build | Required Modules |
 |------|-----------|-------------------|----------------|
-| **1958** | Perceptron | Binary classification with gradient descent | Module 04 |
-| **1969** | XOR Crisis | Hidden layers solve non-linear problems | Module 08 |
-| **1986** | MLP Revival | Multi-class vision (95%+ MNIST) | Module 08 |
-| **1998** | CNN Revolution | Convolutions (70%+ CIFAR-10) | Module 09 |
-| **2017** | Transformers | Language generation with attention | Module 13 |
-| **2018** | MLPerf | Production optimization pipeline | Module 19 |
+| **1958** | Perceptron | First neural network (forward pass) | 01-03 |
+| **1969** | XOR Crisis | Experience the AI Winter trigger | 01-03 |
+| **1986** | MLP Revival | Backprop solves XOR + digit recognition | 01-08 |
+| **1998** | CNN Revolution | Convolutions (70%+ CIFAR-10) | 01-09 |
+| **2017** | Transformers | Language generation with attention | 01-08, 11-13 |
+| **2018** | MLPerf | Production optimization pipeline | 01-08, 14-19 |
 ## Why Milestones Transform Learning


@@ -14,12 +14,12 @@ After completing a set of modules, you unlock the ability to run a milestone. Ea
 | ID | Name | Year | Required Modules | What You'll Do |
 |----|------|------|------------------|----------------|
-| 01 | Perceptron | 1957 | Part 1: 01-04, Part 2: 01-08 | Build Rosenblatt's first neural network |
-| 02 | XOR Crisis | 1969 | Part 1: 01-03, Part 2: 01-08 | Experience and solve the XOR impossibility |
-| 03 | MLP Revival | 1986 | 01-08 | Train MLPs on TinyDigits with backpropagation |
+| 01 | Perceptron | 1958 | 01-03 | Build Rosenblatt's first neural network (forward pass) |
+| 02 | XOR Crisis | 1969 | 01-03 | Experience the XOR limitation that triggered AI Winter |
+| 03 | MLP Revival | 1986 | 01-08 | Train MLPs to solve XOR + recognize digits |
 | 04 | CNN Revolution | 1998 | 01-09 | Build LeNet for image recognition |
-| 05 | Transformer Era | 2017 | 01-13 | Build attention and generate text |
-| 06 | MLPerf Benchmarks | 2018 | 01-19 | Optimize and benchmark your neural networks |
+| 05 | Transformer Era | 2017 | 01-08, 11-13 | Build attention and generate text |
+| 06 | MLPerf Benchmarks | 2018 | 01-08, 14-19 | Optimize and benchmark your neural networks |
 ## Running Milestones


@@ -125,11 +125,11 @@ Concrete outcomes at each major checkpoint:
 | After Module | You'll Have Built | Historical Context |
 |--------------|-------------------|-------------------|
-| **01-04** | Working Perceptron classifier | Rosenblatt 1958 |
+| **01-03** | Working Perceptron classifier (forward pass) | Rosenblatt 1958 |
 | **01-08** | MLP solving XOR + complete training pipeline | AI Winter breakthrough 1969→1986 |
 | **01-09** | CNN with convolutions and pooling | LeNet-5 (1998) |
-| **01-13** | GPT model with autoregressive generation | "Attention Is All You Need" (2017) |
-| **01-19** | Optimized, quantized, accelerated system | Production ML today |
+| **01-08 + 11-13** | GPT model with autoregressive generation | "Attention Is All You Need" (2017) |
+| **01-08 + 14-19** | Optimized, quantized, accelerated system | Production ML today |
 | **01-20** | MLPerf-style benchmarking submission | Torch Olympics |
 ```{admonition} The North Star Build


@@ -157,13 +157,14 @@ You're recreating ML history with your own code. *By Module 19, you'll benchmark
 As you complete more modules, you unlock more milestones:
-| Modules Completed | Milestone Unlocked | What You Recreate |
-|-------------------|-------------------|-------------------|
-| 01-04 | `perceptron` | The 1958 Perceptron |
-| 01-08 | `backprop` | 1986 Backpropagation |
-| 01-09 | `lenet` | 1998 LeNet CNN |
-| 01-13 | `transformer` | 2017 Transformer |
-| 01-19 | `mlperf` | MLPerf Benchmarks |
+| Modules Completed | Milestone | What You Recreate |
+|-------------------|-----------|-------------------|
+| 01-03 | Perceptron (1958) | First neural network (forward pass) |
+| 01-03 | XOR Crisis (1969) | The limitation that triggered AI Winter |
+| 01-08 | MLP Revival (1986) | Backprop solves XOR + real digit recognition |
+| 01-09 | CNN Revolution (1998) | Convolutions for spatial understanding |
+| 01-08 + 11-13 | Transformers (2017) | Language generation with attention |
+| 01-08 + 14-19 | MLPerf (2018) | Production optimization pipeline |
 See all milestones and their requirements:


@@ -145,7 +145,7 @@ See your journey through ML history in a visual tree format.
 **What**: Frank Rosenblatt's first trainable neural network
-**Requires**: Module 01 (Tensor)
+**Requires**: Modules 01-03 (Tensor, Activations, Layers)
 **What you'll do**: Implement and train the perceptron that proved machines could learn
@@ -160,13 +160,13 @@ tito milestone run 01
 ### Milestone 02: XOR Crisis (1969)
-**What**: Solving the problem that stalled AI research
-**Requires**: Modules 01-02 (Tensor, Activations)
-**What you'll do**: Use multi-layer networks to solve XOR - impossible for single-layer perceptrons
-**Historical significance**: Minsky & Papert showed perceptron limitations; this shows how to overcome them
+**What**: Demonstrating the problem that stalled AI research
+**Requires**: Modules 01-03 (Tensor, Activations, Layers)
+**What you'll do**: Experience how single-layer perceptrons fail on XOR - the limitation that triggered the "AI Winter"
+**Historical significance**: Minsky & Papert showed perceptron limitations; this milestone demonstrates the crisis before the solution
 **Run it**:
 ```bash
@@ -213,7 +213,7 @@ tito milestone run 04
 **What**: "Attention is All You Need"
-**Requires**: Modules 01-13 (Foundation + Architecture Tiers)
+**Requires**: Modules 01-08 + 11-13 (Foundation + Embeddings, Attention, Transformers)
 **What you'll do**: Implement transformer architecture with self-attention mechanism
@@ -230,7 +230,7 @@ tito milestone run 05
 **What**: Production ML Systems
-**Requires**: Modules 01-19 (Foundation + Architecture + Optimization Tiers)
+**Requires**: Modules 01-08 + 14-19 (Foundation + Optimization Tier)
 **What you'll do**: Optimize for production deployment with quantization, compression, and benchmarking
@@ -426,7 +426,7 @@ tito module complete XX
 **Solution**:
 1. Check error message for which module failed
-2. Edit `modules/source/XX_name/` (NOT `tinytorch/`)
+2. Edit `modules/XX_name/XX_name.ipynb` (NOT `tinytorch/`)
 3. Re-export: `tito module complete XX`
 4. Run milestone again
@@ -435,8 +435,8 @@ tito module complete XX
 <div style="background: #f8f9fa; padding: 2rem; border-radius: 0.5rem; margin: 2rem 0; text-align: center;">
 <h3 style="margin: 0 0 1rem 0; color: #495057;">Ready to Recreate ML History?</h3>
 <p style="margin: 0 0 1.5rem 0; color: #6c757d;">Start with the Foundation Tier and work toward your first milestone</p>
-<a href="tiers/foundation.html" style="display: inline-block; background: #007bff; color: white; padding: 0.75rem 1.5rem; border-radius: 0.25rem; text-decoration: none; font-weight: 500; margin-right: 1rem;">Foundation Tier →</a>
-<a href="chapters/milestones.html" style="display: inline-block; background: #6f42c1; color: white; padding: 0.75rem 1.5rem; border-radius: 0.25rem; text-decoration: none; font-weight: 500;">Historical Context →</a>
+<a href="../tiers/foundation.html" style="display: inline-block; background: #007bff; color: white; padding: 0.75rem 1.5rem; border-radius: 0.25rem; text-decoration: none; font-weight: 500; margin-right: 1rem;">Foundation Tier →</a>
+<a href="../milestones/milestones_ABOUT.html" style="display: inline-block; background: #6f42c1; color: white; padding: 0.75rem 1.5rem; border-radius: 0.25rem; text-decoration: none; font-weight: 500;">Historical Context →</a>
 </div>