mirror of
https://github.com/MLSysBook/TinyTorch.git
synced 2026-05-07 12:02:33 -05:00
🔧 Issues Fixed:
1. MockTensor compatibility: activations now return the same type as their input (polymorphic)
2. Empty input handling: Softmax gracefully handles zero-size arrays

✅ Impact: 02_activations external tests now pass 34/34 (was 32/34)

🎯 Technical Changes:
- Changed activation signatures from Tensor -> Tensor to flexible types
- Use type(x)(result) instead of hardcoded Tensor(result)
- Added an empty-input guard in Softmax: if x.data.size == 0: return type(x)(x.data.copy())
- Applied the pattern consistently across ReLU, Sigmoid, Tanh, and Softmax

This makes the activations more robust and testable without tight coupling to the Tensor implementation.
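A minimal sketch of the pattern the commit describes, assuming a Tensor-like wrapper exposing a .data NumPy array (MockTensor here is a stand-in, not TinyTorch's actual class):

```python
import numpy as np

class MockTensor:
    """Hypothetical minimal Tensor-like wrapper for illustration."""
    def __init__(self, data):
        self.data = np.asarray(data, dtype=float)

def relu(x):
    # Construct the result with type(x)(...) so the output wrapper
    # matches the input wrapper (works for Tensor, MockTensor, etc.).
    return type(x)(np.maximum(x.data, 0))

def softmax(x):
    # Empty-input guard: zero-size arrays pass through unchanged.
    if x.data.size == 0:
        return type(x)(x.data.copy())
    # Subtract the max for numerical stability before exponentiating.
    shifted = x.data - np.max(x.data, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return type(x)(exp / np.sum(exp, axis=-1, keepdims=True))

out = relu(MockTensor([-1.0, 2.0]))
print(type(out).__name__)   # MockTensor, not a hardcoded Tensor
print(out.data)             # [0. 2.]
print(softmax(MockTensor([])).data.size)  # 0
```

Because the functions never name a concrete Tensor class, tests can exercise them with any lightweight stand-in that has a compatible constructor and .data attribute.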