Mirror of https://github.com/MLSysBook/TinyTorch.git (synced 2026-05-10 16:38:39 -05:00, commit d14f92a9b21b327ed69572b86e8e76c828bc6323)
# 🔥 TinyTorch: Build ML Systems from Scratch

A complete Machine Learning Systems course where students build their own ML framework.
## 🎯 What You'll Build
- Complete ML Framework: Build your own PyTorch-style framework from scratch
- Real Applications: Use your framework to classify CIFAR-10 images
- Production Skills: Learn ML systems engineering, not just algorithms
- Immediate Feedback: See your code working at every step
## 🚀 Quick Start (2 minutes)

### Students

```bash
git clone https://github.com/your-org/tinytorch.git
cd TinyTorch
make install                     # Install dependencies
tito system doctor               # Verify setup
cd assignments/source/00_setup   # Start with setup
jupyter lab setup_dev.py         # Open first assignment
```
### Instructors

```bash
# System check
tito system info                  # Check course status
tito system doctor                # Verify environment

# Assignment management
tito nbgrader generate 00_setup   # Create student assignments
tito nbgrader release 00_setup    # Release to students
tito nbgrader autograde 00_setup  # Auto-grade submissions
```
## 📚 Course Structure

### Core Assignments (6+ weeks of proven content)
- `00_setup` (20/20 tests) - Development workflow & CLI tools
- `02_activations` (24/24 tests) - ReLU, Sigmoid, Tanh functions
- `03_layers` (17/22 tests) - Dense layers & neural building blocks
- `04_networks` (20/25 tests) - Sequential networks & MLPs
- `06_dataloader` (15/15 tests) - CIFAR-10 data loading
- `05_cnn` (2/2 tests) - Convolution operations
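The layers assignment above centers on the dense (fully connected) layer. A minimal NumPy sketch of the kind of code students write (class and attribute names here are illustrative, not the course's reference solution):

```python
import numpy as np

class Dense:
    """Fully connected layer computing y = x @ W + b."""
    def __init__(self, in_features, out_features):
        # Small random weights; the course may use a different init scheme.
        self.W = np.random.randn(in_features, out_features) * 0.01
        self.b = np.zeros(out_features)

    def forward(self, x):
        return x @ self.W + self.b

layer = Dense(4, 2)
out = layer.forward(np.ones((3, 4)))  # batch of 3 inputs, 4 features each
print(out.shape)  # (3, 2)
```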
### Advanced Features (in development)

- `01_tensor` (22/33 tests) - Tensor arithmetic
- 07-13 - Autograd, optimizers, training, MLOps
## 🛠️ Development Workflow

### NBGrader (Assignment Creation & Testing)

```bash
tito nbgrader generate 00_setup   # Create student assignments
tito nbgrader release 00_setup    # Release to students
tito nbgrader collect 00_setup    # Collect submissions
tito nbgrader autograde 00_setup  # Auto-grade with pytest
```
### nbdev (Package Export & Building)

```bash
tito module export 00_setup  # Export to tinytorch package
tito module test 00_setup    # Test package integration
```
## 📈 Student Success Path

Build → Use → Understand → Repeat

- Build: Implement a `ReLU()` function from scratch
- Use: `from tinytorch.core.activations import ReLU` - your own code!
- Understand: See how it works in real neural networks
- Repeat: Each assignment builds on previous work
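The Build step is genuinely small; a NumPy-based sketch of a ReLU like the one students implement (illustrative, not the official solution):

```python
import numpy as np

class ReLU:
    """Rectified Linear Unit: clamps negative values to zero, element-wise."""
    def forward(self, x):
        return np.maximum(0, x)

    def __call__(self, x):
        return self.forward(x)

relu = ReLU()
print(relu(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]
```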
### Example: First Assignment

```python
# You implement this:
def hello_tinytorch():
    print("Welcome to TinyTorch!")

# Then immediately use it:
from tinytorch.core.utils import hello_tinytorch
hello_tinytorch()  # Your code working!
```
## 🎓 Educational Philosophy

### Real Data, Real Systems
- Work with CIFAR-10 (not toy datasets)
- Production-style code organization
- Performance and engineering considerations
- Immediate visual feedback
### Build Everything from Scratch
- No black boxes or "magic" functions
- Understanding through implementation
- Connect every concept to production systems
- See your code working immediately
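Because nothing is a black box, even the container abstractions stay small enough to read in full. A `Sequential`-style container, for instance, is essentially a loop; a sketch with a toy layer (names are illustrative, not TinyTorch's actual API):

```python
class Scale:
    """Toy layer that multiplies its input by a constant factor."""
    def __init__(self, factor):
        self.factor = factor

    def forward(self, x):
        return x * self.factor

class Sequential:
    """Chain layers so each layer's output feeds the next layer's input."""
    def __init__(self, *layers):
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

net = Sequential(Scale(2), Scale(3))
print(net.forward(5))  # 30
```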
📁 Repository Structure
TinyTorch/
├── assignments/source/XX/ # Assignment source files
│ ├── XX_dev.py # Development assignment
│ └── tests/ # Assignment tests
├── tinytorch/ # Your built framework
│ └── core/ # Exported student code
├── tito/ # CLI tools
└── docs/ # Documentation
## 🔧 Technical Requirements
- Python 3.8+
- Jupyter Lab for development
- PyTorch for comparison and final projects
- NBGrader for assignment management
- nbdev for package building
## 🎯 Getting Started

### Students

1. System Check: `tito system doctor`
2. First Assignment: `cd assignments/source/00_setup && jupyter lab setup_dev.py`
3. Build & Test: Follow the notebook, export when complete
4. Use Your Code: `from tinytorch.core.utils import hello_tinytorch`

### Instructors

1. Course Status: `tito system info`
2. Assignment Management: `tito nbgrader generate 00_setup`
3. Student Release: `tito nbgrader release 00_setup`
4. Auto-grading: `tito nbgrader autograde 00_setup`
## 📊 Success Metrics

Students can currently:
- Build and test multi-layer perceptrons
- Implement custom activation functions
- Load and process CIFAR-10 data
- Create basic convolution operations
- Export their code to a working package
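The convolution bullet above boils down to a sliding dot product. A naive, dependency-light version (valid padding, stride 1; strictly a sketch, not the course's implementation):

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 2-D cross-correlation: no padding, stride 1."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Dot product of the kernel with the current image window.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16.0).reshape(4, 4)
kernel = np.ones((2, 2))
result = conv2d(image, kernel)
print(result.shape)   # (3, 3)
print(result[0, 0])   # 10.0  (0 + 1 + 4 + 5)
```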
Verified workflows:
- ✅ Student Journey: receive assignment → implement → export → use
- ✅ Instructor Journey: create → release → collect → grade
- ✅ Package Integration: All core imports work correctly
🎉 TinyTorch is ready for classroom use with 6+ weeks of proven curriculum content!