TinyTorch Milestones

Milestones are capstone experiences that bring together everything you've built in the TinyTorch modules. Each milestone recreates a pivotal moment in ML history using YOUR implementations.

How Milestones Work

After completing a set of modules, you unlock the ability to run a milestone. Each milestone:

  1. Uses YOUR code - Every tensor operation, gradient computation, and layer runs on code YOU wrote (see the sketch after this list)
  2. Recreates history - Experience the same breakthroughs researchers achieved decades ago
  3. Proves understanding - If it works, you truly understand how these systems function
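
To make point 1 concrete, here is a minimal NumPy stand-in for what Milestone 01 exercises. This is an illustrative sketch only: in the real milestone, the tensor, the linear layer, and the activation all come from YOUR implementations in modules 01-03, and the names below are placeholders, not TinyTorch's API.

import numpy as np

# Illustrative stand-in for Milestone 01's forward pass. The real
# milestone runs this same computation through YOUR Tensor and Linear
# implementations; the NumPy names here are placeholders.
rng = np.random.default_rng()
x = np.array([1.0, 0.0])           # one input sample
W = rng.normal(size=(1, 2))        # weights a Linear layer would hold
b = np.zeros(1)                    # bias
logits = W @ x + b                 # the linear forward pass
output = (logits > 0).astype(int)  # step activation, as in Rosenblatt's perceptron
print(output)                      # [0] or [1], depending on the random weights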

Available Milestones

ID | Name              | Year | Required Modules | What You'll Do
01 | Perceptron        | 1958 | 01-03            | Build Rosenblatt's first neural network (forward pass)
02 | XOR Crisis        | 1969 | 01-03            | Experience the XOR limitation that triggered the AI Winter
03 | MLP Revival       | 1986 | 01-08            | Train MLPs to solve XOR and recognize digits
04 | CNN Revolution    | 1998 | 01-09            | Build LeNet for image recognition
05 | Transformer Era   | 2017 | 01-08, 11-13     | Prove attention with reversal, copy, and mixed sequence tasks
06 | MLPerf Benchmarks | 2018 | 01-08, 14-19     | Optimize and benchmark your neural networks

Running Milestones

# List available milestones and your progress
tito milestone list

# Run a specific milestone (all parts)
tito milestone run 03

# Run a specific part of a multi-part milestone
tito milestone run 03 --part 1  # Part 1: XOR Solved
tito milestone run 03 --part 2  # Part 2: TinyDigits

# Get detailed info about a milestone
tito milestone info 05

Directory Structure

milestones/
├── 01_1958_perceptron/     # Milestone 01: Rosenblatt's Perceptron
├── 02_1969_xor/            # Milestone 02: XOR Problem
├── 03_1986_mlp/            # Milestone 03: Backpropagation MLP
├── 04_1998_cnn/            # Milestone 04: LeNet CNN
├── 05_2017_transformer/    # Milestone 05: Attention Mechanism
├── 06_2018_mlperf/         # Milestone 06: Optimization Olympics
└── data_manager.py         # Shared dataset management utility

The Journey

[Figure: Milestone progression from Perceptron (1958) through Transformer (2017) to MLPerf (2018)]

Success Criteria

Each milestone has specific success criteria. Passing means your implementation is correct:

  • Milestone 01: Forward pass produces reasonable outputs
  • Milestone 02: Demonstrates XOR is unsolvable with a single layer (75% max accuracy, since a linear boundary can classify at most 3 of the 4 XOR points; see the sketch after this list)
  • Milestone 03: Part 1 solves XOR (100% accuracy), Part 2 achieves 85%+ on TinyDigits
  • Milestone 04: TinyDigits achieves 90%+ accuracy with CNN
  • Milestone 05: Pass all three attention challenges (95%+ accuracy)
  • Milestone 06: Part 1 completes optimization pipeline, Part 2 shows KV cache speedup
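
The 75% ceiling in Milestone 02 is not a tuning artifact: no linear decision boundary separates XOR's four points. The following self-contained NumPy sketch (a stand-in, not your TinyTorch code) brute-forces a grid of single-layer linear-threshold classifiers and shows that none exceeds 3/4 accuracy.

import numpy as np
from itertools import product

# XOR truth table: the four input points and their labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

# Brute-force a grid of single-layer (linear threshold) classifiers.
# None of them gets all four points right; the best is 3/4 = 75%.
best = 0.0
for w1, w2, b in product(np.linspace(-2, 2, 41), repeat=3):
    pred = (X @ np.array([w1, w2]) + b > 0).astype(int)
    best = max(best, (pred == y).mean())

print(f"best single-layer accuracy on XOR: {best:.0%}")  # 75%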

Troubleshooting

If a milestone fails:

  1. Check that all required modules are completed: tito module status
  2. Run the module tests: tito test <module_number>
  3. Look at the specific error message for debugging hints
  4. Review the milestone's docstring for implementation requirements