Quick Start Guide
From Zero to Building Neural Networks
Complete setup + first module in 15 minutes
Purpose: Get hands-on experience building ML systems in 15 minutes. Complete setup verification and build your first neural network component from scratch.
⚡ 2-Minute Setup
Let's get you ready to build ML systems:
Step 1: One-Command Setup
```bash
# Clone repository
git clone https://github.com/mlsysbook/TinyTorch.git
cd TinyTorch

# Automated setup (handles everything!)
./setup-environment.sh

# Activate environment
source activate.sh
```
What this does:
- ✅ Creates optimized virtual environment (arm64 on Apple Silicon)
- ✅ Installs all dependencies (NumPy, Jupyter, Rich, PyTorch for validation)
- ✅ Configures TinyTorch in development mode
- ✅ Verifies installation
📖 See Essential Commands for detailed workflow and troubleshooting.
Step 2: Verify Setup
```bash
# Run system diagnostics
tito system doctor
```
You should see all green checkmarks! This confirms your environment is ready for hands-on ML systems building.
📖 See Essential Commands for verification commands and troubleshooting.
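If `tito` isn't available yet (for example, before activating the environment), you can run a rough manual check in Python. This is only a sketch of what a diagnostic does, not the actual `tito system doctor` implementation, and the package list here is an assumption based on the dependencies named above:

```python
import importlib
import sys

def check_env(packages=("numpy", "rich")):
    """Report which core dependencies import cleanly.

    Rough stand-in for a doctor command; the real tool checks much more.
    Returns {package_name: version_or_marker_string, or None if missing}.
    """
    results = {}
    for pkg in packages:
        try:
            mod = importlib.import_module(pkg)
            results[pkg] = getattr(mod, "__version__", "installed")
        except ImportError:
            results[pkg] = None
    return results

if __name__ == "__main__":
    print(f"Python {sys.version_info.major}.{sys.version_info.minor}")
    for pkg, version in check_env().items():
        print(f"{'OK     ' if version else 'MISSING'} {pkg} {version or ''}")
```

If anything prints `MISSING`, re-run the setup script before continuing.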
🏗️ 15-Minute First Module Walkthrough
Let's build your first neural network component and unlock your first capability:
Module 01: Tensor Foundations
🎯 Learning Goal: Build N-dimensional arrays - the foundation of all neural networks
⏱️ Time: 15 minutes
💻 Action: Start with Module 01 to build tensor operations from scratch.
```bash
# Navigate to the tensor module
cd modules/01_tensor
jupyter lab tensor_dev.py
```
You'll implement core tensor operations:
- N-dimensional array creation
- Basic mathematical operations (add, multiply, matmul)
- Shape manipulation (reshape, transpose)
- Memory layout understanding
Key Implementation: Build the Tensor class that forms the foundation of all neural networks
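To give a feel for what you'll build, here is a minimal NumPy-backed sketch of such a class. This is illustrative only; the actual API you implement in `tensor_dev.py` will differ:

```python
import numpy as np

class Tensor:
    """Minimal N-dimensional tensor backed by a NumPy array (illustrative sketch)."""

    def __init__(self, data):
        # Store data contiguously as float32, the common deep-learning default
        self.data = np.asarray(data, dtype=np.float32)

    @property
    def shape(self):
        return self.data.shape

    def __add__(self, other):
        return Tensor(self.data + other.data)   # elementwise add

    def __mul__(self, other):
        return Tensor(self.data * other.data)   # elementwise multiply

    def matmul(self, other):
        return Tensor(self.data @ other.data)   # matrix multiplication

    def reshape(self, *shape):
        return Tensor(self.data.reshape(*shape))

    def transpose(self):
        return Tensor(self.data.T)

a = Tensor([[1.0, 2.0], [3.0, 4.0]])
b = Tensor([[5.0, 6.0], [7.0, 8.0]])
print(a.matmul(b).shape)  # (2, 2)
```

Everything else in TinyTorch (layers, losses, optimizers) composes operations like these.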
📖 See Essential Commands for module workflow commands.
✅ Achievement Unlocked: Foundation capability - "Can I create and manipulate the building blocks of ML?"
Next Step: Module 02 - Activations
🎯 Learning Goal: Add nonlinearity - the key to neural network intelligence
⏱️ Time: 10 minutes
💻 Action: Continue with Module 02 to add activation functions.
You'll implement essential activation functions:
- ReLU (Rectified Linear Unit) - the workhorse of deep learning
- Softmax - for probability distributions
- Understand gradient flow and numerical stability
- Learn why nonlinearity enables learning
Key Implementation: Build activation functions that allow neural networks to learn complex patterns
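The two functions above can be sketched in a few lines of NumPy. Note the max-subtraction trick in softmax: it is the standard fix for the numerical-stability issue mentioned above, since `exp()` overflows for large logits. Again, treat this as a sketch rather than the module's exact implementation:

```python
import numpy as np

def relu(x):
    """ReLU: zero out negatives. Cheap, and gradients don't saturate for x > 0."""
    return np.maximum(0.0, x)

def softmax(x, axis=-1):
    """Numerically stable softmax: subtracting the max leaves the result
    unchanged mathematically but keeps exp() from overflowing."""
    shifted = x - np.max(x, axis=axis, keepdims=True)
    exps = np.exp(shifted)
    return exps / np.sum(exps, axis=axis, keepdims=True)

logits = np.array([1000.0, 1001.0, 1002.0])  # naive exp(1000) would overflow
print(softmax(logits))  # valid probabilities that sum to 1
```

Without the shift, `softmax(logits)` on these inputs would produce `nan`s; with it, the result is a clean probability distribution.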
📖 See Essential Commands for module development workflow.
✅ Achievement Unlocked: Intelligence capability - "Can I add nonlinearity to enable learning?"
📊 Track Your Progress
After completing your first modules:
Check your new capabilities: Track your progress through the 21-checkpoint system to see your growing ML systems expertise.
📖 See Track Your Progress for detailed capability tracking and Essential Commands for progress monitoring commands.
🏆 Unlock Historical Milestones
As you progress, prove what you've built by recreating history's greatest ML breakthroughs:
After Module 04: Build Rosenblatt's 1957 Perceptron - the first trainable neural network
After Module 06: Solve the 1969 XOR Crisis with multi-layer networks
After Module 08: Achieve 95%+ accuracy on MNIST with 1986 backpropagation
After Module 09: Hit 75%+ on CIFAR-10 with 1998 CNNs - your North Star goal! 🎯
📖 See Journey Through ML History for complete milestone demonstrations.
Why Milestones Matter: These aren't toy demos - they're historically significant achievements proving YOUR implementations work at production scale!
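To preview the first milestone: the 1957 perceptron learning rule fits in a dozen lines. This is a generic sketch with hypothetical toy data, not the repo's milestone script, and it assumes labels encoded as -1/+1:

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    """Rosenblatt's perceptron rule: nudge the weights on every mistake.

    X: (n_samples, n_features), y: labels in {-1, +1}.
    Converges on linearly separable data (illustrative sketch only).
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:    # misclassified (or on the boundary)?
                w += lr * yi * xi          # move the boundary toward xi
                b += lr * yi
    return w, b

# Hypothetical linearly separable toy data: class follows the first feature's sign
X = np.array([[2.0, 1.0], [1.5, -1.0], [-2.0, 0.5], [-1.0, -1.5]])
y = np.array([1, 1, -1, -1])
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))  # [ 1.  1. -1. -1.] — matches y
```

The milestone versions run your own Tensor and activation implementations instead of raw NumPy, which is exactly the point.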
🎯 What You Just Accomplished
In your first session, you've:
🔧 Setup Complete
Installed TinyTorch and verified your environment
🧱 Created Foundation
Implemented core tensor operations from scratch
🏆 First Capability
Earned your first ML systems capability checkpoint
🚀 Your Next Steps
Immediate Next Actions (Choose One):
🔥 Continue Building (Recommended): Move on to Module 03 and keep building on your tensor and activation foundations.
📚 Learn the Workflow:
- 📖 See Essential Commands for complete TITO command guide
- 📖 See Track Your Progress for the full learning path
🎓 For Instructors:
- 📖 See Classroom Setup Guide for NBGrader integration and grading workflow
💡 Pro Tips for Continued Success
Essential Development Practices:
- Always verify your environment before starting
- Track your progress through capability checkpoints
- Follow the standard module development workflow
- Use diagnostic commands when debugging issues
📖 See Essential Commands for complete workflow commands and troubleshooting guide.
🌟 You're Now a TinyTorch Builder!
Ready to Build Production ML Systems
You've proven you can build ML components from scratch. Time to keep going!
Continue Building → Master Commands →
What makes TinyTorch different: You're not just learning about neural networks—you're building them from fundamental mathematical operations. Every line of code you write builds toward complete ML systems mastery.
Next milestone: After Module 08, you'll train real neural networks on actual datasets using 100% your own code!