mirror of
https://github.com/MLSysBook/TinyTorch.git
synced 2026-03-12 06:33:34 -05:00
Clean, dependency-driven organization:
- Part I (1-5): MLPs for XORNet
- Part II (6-10): CNNs for CIFAR-10
- Part III (11-15): Transformers for TinyGPT

Key improvements:
- Dropped modules 16-17 (regularization/systems) to maintain scope
- Moved normalization to module 13 (Part III, where it is needed)
- Created three CIFAR-10 examples: random, MLP, CNN
- Each part introduces ONE major innovation (FC → Conv → Attention)

CIFAR-10 now showcases the progression:
- test_random_baseline.py: ~10% (random chance)
- train_mlp.py: ~55% (no convolutions)
- train_cnn.py: ~60%+ (with Conv2D, showing why convolutions matter)

This follows actual ML history, and each module is needed for its capstone.
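The ~10% figure for the random baseline follows directly from CIFAR-10 having 10 balanced classes: uniform guessing is right about 1 time in 10. A minimal self-contained sketch of that check (an illustration only, not the repo's actual test_random_baseline.py):

```python
import random

random.seed(0)

NUM_CLASSES = 10      # CIFAR-10 has 10 balanced classes
NUM_SAMPLES = 100_000  # enough samples for the estimate to settle near 10%

# Simulate balanced ground-truth labels and a classifier that guesses
# uniformly at random, independent of the input.
labels = [random.randrange(NUM_CLASSES) for _ in range(NUM_SAMPLES)]
guesses = [random.randrange(NUM_CLASSES) for _ in range(NUM_SAMPLES)]

accuracy = sum(g == y for g, y in zip(guesses, labels)) / NUM_SAMPLES
print(f"random-baseline accuracy: {accuracy:.3f}")  # close to 0.10
```

Any trained model, even the convolution-free MLP, should clear this floor by a wide margin; that gap is what makes the random baseline a useful sanity check.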