- Implemented numerically stable binary cross-entropy using the log-sum-exp trick
- Computes loss directly from logits, without an explicit sigmoid computation
- Handles extreme logit values (±100) correctly, without overflow or underflow
- Fixed issue where extreme predictions produced NaN values
- All training module tests now pass

Technical improvements:
- Uses log_sigmoid(x) = x - max(0, x) - log(1 + exp(-abs(x)))
- Avoids computing sigmoid entirely, improving numerical stability
- Maintains mathematical correctness while preventing overflow
- Perfect predictions now produce near-zero loss, as expected
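
For reference, a minimal sketch of the technique described above, assuming a
NumPy-based API (the names log_sigmoid and bce_with_logits are illustrative,
not necessarily TinyTorch's actual interface):

```python
import numpy as np

def log_sigmoid(x):
    """Stable log(sigmoid(x)) = x - max(0, x) - log(1 + exp(-|x|)).

    Note x - max(0, x) equals min(0, x), and exp(-|x|) <= 1, so no
    intermediate value can overflow even for logits like +/-100.
    """
    return x - np.maximum(0.0, x) - np.log1p(np.exp(-np.abs(x)))

def bce_with_logits(logits, targets):
    """Binary cross-entropy computed directly from logits.

    Uses log(sigmoid(x)) for the positive term and
    log(1 - sigmoid(x)) = log(sigmoid(-x)) for the negative term,
    so sigmoid itself is never evaluated and extreme logits cannot
    produce NaN.
    """
    return -np.mean(targets * log_sigmoid(logits)
                    + (1.0 - targets) * log_sigmoid(-logits))

# Extreme logits that break a naive -log(sigmoid(x)) implementation:
logits = np.array([100.0, -100.0, 0.0])
targets = np.array([1.0, 0.0, 1.0])
print(bce_with_logits(logits, targets))  # small finite loss, no NaN
```

With the first two (perfect) predictions contributing essentially zero loss,
the printed value is dominated by the uncertain third example, matching the
"near-zero loss for perfect predictions" behavior noted above.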