mirror of
https://github.com/MLSysBook/TinyTorch.git
synced 2026-03-11 22:03:34 -05:00
feat: Create clean modular architecture with activations → layers separation
Major architectural improvement implementing clean separation of concerns:

✨ NEW: Activations Module
- Complete activations module with ReLU, Sigmoid, Tanh implementations
- Educational NBDev structure with student TODOs + instructor solutions
- Comprehensive testing suite (24 tests) with mathematical correctness validation
- Visual learning features with matplotlib plotting (disabled during testing)
- Clean export to tinytorch.core.activations

🔧 REFACTOR: Layers Module
- Removed duplicate activation function implementations
- Clean import from activations module: `from tinytorch.core.activations import ReLU, Sigmoid, Tanh`
- Updated documentation to reflect the modular architecture
- Preserved all existing functionality while improving code organization

🧪 TESTING: Comprehensive Test Coverage
- All 24 activations tests passing ✅
- All 17 layers tests passing ✅
- Integration tests verify the clean architecture works end to end
- CLI testing with `tito test --module` works for both modules

📦 ARCHITECTURE: Clean Dependency Graph
- activations (math functions) → layers (building blocks) → networks (applications)
- Separation of concerns: pure math vs. neural network components
- Reusable components across future modules
- Single source of truth for activation implementations

PEDAGOGY: Enhanced Learning Experience
- Week-sized chunks: students master activations, then build layers
- Clear progression from mathematical foundations to applications
- Real-world software architecture patterns
- Modular design principles in practice

This establishes the foundation for scalable, maintainable ML systems education.
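The three activation classes named in the message might look like this minimal sketch; the class names come from the commit, but the NumPy-backed callable-class implementation shown here is an assumption, not TinyTorch's actual code:

```python
import numpy as np

class ReLU:
    """Rectified Linear Unit: elementwise max(0, x)."""
    def __call__(self, x):
        return np.maximum(0, x)

class Sigmoid:
    """Logistic function: 1 / (1 + exp(-x)), squashes to (0, 1)."""
    def __call__(self, x):
        return 1.0 / (1.0 + np.exp(-x))

class Tanh:
    """Hyperbolic tangent, squashes to (-1, 1)."""
    def __call__(self, x):
        return np.tanh(x)
```

Keeping these as standalone classes (rather than methods on layers) is what lets the layers module import them instead of duplicating the math.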
@@ -343,7 +343,7 @@ def cmd_info(args):
 def cmd_test(args):
     """Run tests for a specific module."""
-    valid_modules = ["setup", "tensor", "layers", "cnn", "data", "training",
+    valid_modules = ["setup", "tensor", "activations", "layers", "cnn", "data", "training",
                      "profiling", "compression", "kernels", "benchmarking", "mlops"]
 
     if args.all:
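The hunk above registers "activations" in the CLI's module whitelist. A hedged sketch of how such a whitelist check typically works follows; the `validate_module` helper and its error message are illustrative assumptions, not code from the diff:

```python
# Module names taken from the diff's updated valid_modules list.
VALID_MODULES = ["setup", "tensor", "activations", "layers", "cnn", "data",
                 "training", "profiling", "compression", "kernels",
                 "benchmarking", "mlops"]

def validate_module(name: str) -> str:
    """Return the module name if it is testable, else raise ValueError.

    Hypothetical helper: the real cmd_test may inline this check.
    """
    if name not in VALID_MODULES:
        raise ValueError(f"Unknown module {name!r}; "
                         f"valid modules: {', '.join(VALID_MODULES)}")
    return name
```

With this check in place, `tito test --module activations` passes validation, which is exactly what the one-line diff enables.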