mirror of
https://github.com/MLSysBook/TinyTorch.git
synced 2026-05-06 07:53:29 -05:00
🎯 Issues Fixed:
1. MLP Architecture: convert MLP from a function to a proper class with `.network` and `.input_size` attributes
2. Polymorphic Layers: update Dense and the activations in the exported package to preserve input types
3. Design Decision: remove the default output activation from MLP (the test expects 3 layers, not 4)

✅ Impact: 04_networks external tests now pass 25/25 (was 18/25)

🔧 Technical Changes:
- Convert the MLP function into an MLP class with attributes and a `.network` property
- Fix `tinytorch.core.layers.Dense` to use `type(x)(result)` instead of `Tensor(result)`
- Fix `tinytorch.core.activations` (ReLU/Sigmoid/Tanh/Softmax) for polymorphic behavior
- Set `output_activation=None` as the default for a general-purpose MLP
- All layers/activations now work with MockTensor for better testability

This makes the networks module fully compatible with external testing frameworks and provides proper OOP design for MLP.
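The two core fixes above can be sketched as follows. This is a minimal illustration, not TinyTorch's actual source: the class names (`MockTensor`, `Dense`, `ReLU`, `MLP`) and their signatures are assumptions made for the example. It shows the `type(x)(result)` pattern that keeps layers polymorphic over the caller's tensor type, and the MLP-as-a-class design with `.network` and `.input_size` attributes and no output activation by default.

```python
import numpy as np

class MockTensor:
    """Hypothetical stand-in for a test framework's tensor; wraps a NumPy array."""
    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)

class Dense:
    """Fully connected layer that preserves the caller's tensor type."""
    def __init__(self, input_size, output_size):
        self.weights = (np.random.randn(input_size, output_size) * 0.1).astype(np.float32)
        self.bias = np.zeros(output_size, dtype=np.float32)

    def __call__(self, x):
        result = x.data @ self.weights + self.bias
        # Key fix: rebuild with the input's own class (type(x)(result))
        # rather than a hard-coded Tensor(result), so MockTensor in
        # means MockTensor out.
        return type(x)(result)

class ReLU:
    """Activation with the same type-preserving behavior."""
    def __call__(self, x):
        return type(x)(np.maximum(x.data, 0.0))

class MLP:
    """MLP as a class (not a function), exposing .network and .input_size."""
    def __init__(self, layer_sizes, output_activation=None):
        self.input_size = layer_sizes[0]
        self.network = []
        for i in range(len(layer_sizes) - 1):
            self.network.append(Dense(layer_sizes[i], layer_sizes[i + 1]))
            # Hidden layers get an activation; the output layer does not
            # unless output_activation is passed explicitly (default None).
            if i < len(layer_sizes) - 2:
                self.network.append(ReLU())
        if output_activation is not None:
            self.network.append(output_activation)

    def __call__(self, x):
        for layer in self.network:
            x = layer(x)
        return x

mlp = MLP([3, 4, 2])   # Dense, ReLU, Dense -> 3 layers, not 4
out = mlp(MockTensor([[1.0, 2.0, 3.0]]))
print(len(mlp.network))        # 3
print(type(out).__name__)      # MockTensor
print(out.data.shape)          # (1, 2)
```

Returning `type(x)(result)` is what lets external test suites substitute their own tensor class: any wrapper whose constructor accepts raw array data flows through every layer unchanged in type.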