Mirror of https://github.com/MLSysBook/TinyTorch.git (synced 2026-04-25 18:37:46 -05:00)
# 📚 TinyTorch Documentation
Complete documentation for the TinyTorch ML Systems course.
## 🎯 Quick Start Navigation
### For Instructors 👨🏫
- 📖 Instructor Guide - Complete teaching guide with verified modules, commands, and class structure
- 🎓 Pedagogy - Educational principles and course philosophy
### For Students 👨🎓
- 🔥 Student Guide - Complete course navigation and learning path
- 📚 Module README - Individual module instructions and status
### For Developers 👨💻
- 🛠️ Development - Module creation and contribution guidelines
## 📊 Current Course Status
### ✅ Ready for Students (6+ weeks of content)
- 00_setup (20/20 tests) - Development workflow & CLI tools
- 02_activations (24/24 tests) - ReLU, Sigmoid, Tanh functions
- 03_layers (17/22 tests) - Dense layers & neural building blocks
- 04_networks (20/25 tests) - Sequential networks & MLPs
- 06_dataloader (15/15 tests) - CIFAR-10 data loading
- 05_cnn (2/2 tests) - Convolution operations
### 🚧 In Development
- 01_tensor (22/33 tests) - Tensor arithmetic (partially working)
- 07-13 - Advanced features (autograd, training, MLOps)
## 🚀 Quick Commands

### System Status

```bash
tito system info      # Check system and module status
tito system doctor    # Verify environment setup
tito module status    # View all module progress
```

### Module Development

```bash
cd modules/00_setup                      # Navigate to module
jupyter lab setup_dev.py                 # Open development notebook
python -m pytest tests/ -v               # Run module tests
python bin/tito module export 00_setup   # Export to package
```

### Package Usage

```bash
# Use student implementations
python -c "from tinytorch.core.utils import hello_tinytorch; hello_tinytorch()"
python -c "from tinytorch.core.activations import ReLU; print(ReLU()([-1, 0, 1]))"
```
## 🎓 Educational Philosophy
### Build → Use → Understand → Repeat

Students implement ML components from scratch, then immediately use their implementations:

- **Build**: Implement the `ReLU()` function
- **Use**: Import it with `from tinytorch.core.activations import ReLU`
- **Understand**: See how it works in real networks
- **Repeat**: Each module builds on previous work
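A from-scratch version of the first step of that cycle might look like the sketch below. The class name mirrors `tinytorch.core.activations.ReLU` used elsewhere in this README, but the body here is an illustrative NumPy sketch, not the course's reference solution:

```python
import numpy as np

class ReLU:
    """Rectified linear unit: max(0, x), applied elementwise."""

    def __call__(self, x):
        # Accept plain Python lists as well as arrays, so the README's
        # ReLU()([-1, 0, 1]) one-liner works unchanged.
        return np.maximum(0, np.asarray(x))

relu = ReLU()
print(relu([-1, 0, 1]))  # [0 0 1]
```

Once exported, the same class is imported back from the package, which is what makes the "Use" step immediate.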
### Real Data, Real Systems
- Work with CIFAR-10 (not toy datasets)
- Production-style code organization
- Performance and engineering considerations
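The data-loading work centers on batching real CIFAR-10 tensors. The sketch below shows the core batching logic under stated assumptions: `iterate_batches` is a hypothetical helper name, not the course's DataLoader API, and the arrays are zero-filled stand-ins for real CIFAR-10 data:

```python
import numpy as np

def iterate_batches(images, labels, batch_size=32, shuffle=True, seed=0):
    """Yield (batch_images, batch_labels) pairs from paired arrays."""
    n = len(images)
    order = np.arange(n)
    if shuffle:
        # Seeded RNG keeps epochs reproducible while still shuffling.
        np.random.default_rng(seed).shuffle(order)
    for start in range(0, n, batch_size):
        idx = order[start:start + batch_size]
        yield images[idx], labels[idx]

# Stand-in for CIFAR-10: 100 RGB images of 32x32 pixels.
images = np.zeros((100, 32, 32, 3), dtype=np.uint8)
labels = np.zeros(100, dtype=np.int64)

batches = list(iterate_batches(images, labels, batch_size=32))
print(len(batches))         # 4 (three full batches of 32, one partial of 4)
print(batches[0][0].shape)  # (32, 32, 32, 3)
```

The partial final batch is the kind of engineering detail a real loader has to handle, which is why the course uses a real dataset rather than toy data.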
### Immediate Feedback
- Tests provide instant verification
- Students see their code working quickly
- Progress is visible and measurable
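The module suites run under pytest (`python -m pytest tests/ -v`). As a self-contained illustration of the kind of instant-verification check involved, the sketch below tests a stand-in `relu`; the test names are hypothetical, not taken from the course's actual suites:

```python
import numpy as np

def relu(x):
    # Stand-in for a student implementation.
    return np.maximum(0, np.asarray(x))

def test_relu_zeroes_negatives():
    assert relu([-2, -1, 0]).tolist() == [0, 0, 0]

def test_relu_passes_positives():
    assert relu([1, 2, 3]).tolist() == [1, 2, 3]

# pytest would discover these automatically; calling them directly
# keeps the sketch runnable as a plain script.
test_relu_zeroes_negatives()
test_relu_passes_positives()
print("all tests passed")
```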
## 📁 Documentation Structure

### Quick Reference

- `INSTRUCTOR_GUIDE.md` - Complete teaching guide
- `STUDENT_GUIDE.md` - Complete learning path
### Detailed Guides

- `pedagogy/` - Educational principles and course philosophy
- `development/` - Module creation and development guidelines
### Legacy Documentation

The `development/` directory contains detailed module creation guides that were used to build the current working modules. This documentation is preserved for reference, but the main teaching workflow is now covered in the Instructor and Student guides.
## 🌟 Success Metrics
### Working Capabilities
Students can currently:
- Build and test multi-layer perceptrons
- Implement custom activation functions
- Load and process CIFAR-10 data
- Create basic convolution operations
- Export their code to a working package
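Putting the first two capabilities together, a multi-layer perceptron for CIFAR-10 class scores can be sketched as below. The names `Dense` and `Sequential` echo the 03_layers and 04_networks modules listed above, but these particular constructor signatures are assumptions made for the sketch:

```python
import numpy as np

class Dense:
    """Fully connected layer: y = x @ W + b."""
    def __init__(self, in_features, out_features, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, size=(in_features, out_features))
        self.b = np.zeros(out_features)

    def __call__(self, x):
        return x @ self.W + self.b

class Sequential:
    """Chain callables, feeding each output into the next layer."""
    def __init__(self, *layers):
        self.layers = layers

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

relu = lambda x: np.maximum(0, x)

# 3072 inputs = a flattened 32x32x3 CIFAR-10 image; 10 class scores out.
mlp = Sequential(Dense(3072, 64), relu, Dense(64, 10))
scores = mlp(np.zeros((1, 3072)))
print(scores.shape)  # (1, 10)
```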
### Verified Workflows
- ✅ Instructor Journey: develop → export → test → package
- ✅ Student Journey: import → use → build → understand
- ✅ Package Integration: All core imports work correctly
## 🔧 Technical Details

### Module Structure

Each module follows this pattern:

- `modules/XX_name/` - Module directory
- `XX_name_dev.py` - Development notebook (Jupytext format)
- `tests/` - Comprehensive test suite
- `README.md` - Module-specific instructions
### Export System

- Students develop in `XX_name_dev.py`
- Export to the `tinytorch.core.XX_name` package
- Import and use their implementations immediately
## 🚀 Getting Started

### Instructors

- Read the Instructor Guide
- Verify your system: `tito system doctor`
- Test the first module: `cd modules/00_setup && jupyter lab setup_dev.py`

### Students

- Read the Student Guide
- Start with: `cd modules/00_setup && jupyter lab setup_dev.py`
- Follow the 5-step workflow for each module
### Developers
- Review the development/ directory
- Follow existing module patterns
- Test thoroughly before contributing
🎉 TinyTorch is ready for classroom use with 6+ weeks of proven curriculum content!