mirror of
https://github.com/MLSysBook/TinyTorch.git
synced 2026-05-08 12:12:33 -05:00
- Add polymorphic Dense layer supporting both Tensor and Variable inputs
- Implement gradient-aware matrix multiplication with proper backward functions
- Preserve autograd chain through layer computations while maintaining backward compatibility
- Add comprehensive tests for Tensor/Variable interoperability
- Enable end-to-end neural network training with gradient flow

Educational benefits:
- Students can use layers in both inference (Tensor) and training (Variable) modes
- Autograd integration happens transparently without API changes
- Maintains clear separation between concepts while enabling practical usage
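The polymorphism described above can be sketched as follows. This is a minimal illustration, not TinyTorch's actual implementation: the `Tensor`, `Variable`, and `Dense` classes here are simplified stand-ins, and the `_backward` hook is a hypothetical mechanism assumed for the example. The key idea is that `Dense.__call__` dispatches on input type, returning a plain `Tensor` for inference and a gradient-tracking `Variable` for training, without any API change visible to the caller.

```python
import numpy as np

class Tensor:
    """Plain value container: forward-only, no gradient tracking."""
    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float64)

class Variable:
    """Value plus gradient slot and a backward hook (hypothetical autograd)."""
    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float64)
        self.grad = np.zeros_like(self.data)
        self._backward = lambda: None  # replaced when an op produces this node

class Dense:
    """Fully connected layer accepting either Tensor or Variable input."""
    def __init__(self, in_features, out_features, seed=0):
        rng = np.random.default_rng(seed)
        self.W = Variable(rng.normal(0.0, 0.1, (in_features, out_features)))
        self.b = Variable(np.zeros(out_features))

    def __call__(self, x):
        out_data = x.data @ self.W.data + self.b.data
        if isinstance(x, Variable):
            out = Variable(out_data)
            def _backward():
                # Chain rule for y = x @ W + b, given upstream out.grad
                x.grad += out.grad @ self.W.data.T
                self.W.grad += x.data.T @ out.grad
                self.b.grad += out.grad.sum(axis=0)
                x._backward()  # continue propagating down the chain
            out._backward = _backward
            return out
        return Tensor(out_data)  # inference mode: no gradient bookkeeping

# Inference mode: Tensor in, Tensor out
layer = Dense(3, 2)
y_inf = layer(Tensor([[1.0, 2.0, 3.0]]))

# Training mode: Variable in, Variable out, gradients flow backward
x = Variable([[1.0, 2.0, 3.0]])
y = layer(x)
y.grad = np.ones_like(y.data)  # seed the upstream gradient
y._backward()                  # fills x.grad, layer.W.grad, layer.b.grad
```

The design choice worth noting: dispatching inside `__call__` keeps a single code path for the forward math, so the autograd wiring is attached only when the input actually carries gradient state.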