TinyTorch/modules
Vijay Janapa Reddi 885c211b15 Fix: Numerical stability in BinaryCrossEntropyLoss
- Implemented numerically stable binary cross-entropy using log-sum-exp trick
- Computes loss directly from logits without sigmoid computation
- Handles extreme logit values (±100) correctly without overflow/underflow
- All training module tests now pass successfully
- Fixed issue where extreme predictions produced NaN loss values

Technical improvements:
- Uses log_sigmoid(x) = x - max(0,x) - log(1 + exp(-abs(x)))
- Avoids sigmoid computation entirely for better numerical stability
- Maintains mathematical correctness while preventing overflow
- Perfect predictions now produce near-zero loss as expected
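The technique the commit describes can be sketched in NumPy. This is an illustrative sketch, not TinyTorch's actual implementation; the function name `bce_with_logits` is hypothetical. It uses the stable identity `log_sigmoid(x) = x - max(0, x) - log(1 + exp(-abs(x)))` from the notes above, so `exp()` never receives a large positive argument:

```python
import numpy as np

def bce_with_logits(logits, targets):
    """Numerically stable binary cross-entropy computed directly from logits.

    BCE = -[y * log(sigmoid(x)) + (1 - y) * log(sigmoid(-x))], but both
    log-sigmoid terms are evaluated via the log-sum-exp identity so that
    sigmoid itself is never computed and extreme logits cannot overflow.
    """
    x = np.asarray(logits, dtype=np.float64)
    y = np.asarray(targets, dtype=np.float64)
    # Shared stable term: log(1 + exp(-|x|)) is at most log(2), never overflows.
    softplus_abs = np.log1p(np.exp(-np.abs(x)))
    # log(sigmoid(x))  = x - max(0, x) - log(1 + exp(-|x|))
    log_sig_pos = x - np.maximum(x, 0.0) - softplus_abs
    # log(sigmoid(-x)) = -max(0, x) - log(1 + exp(-|x|))
    log_sig_neg = -np.maximum(x, 0.0) - softplus_abs
    return float(np.mean(-(y * log_sig_pos + (1.0 - y) * log_sig_neg)))
```

With logits of ±100 matching their targets, the loss comes out near zero and finite, matching the behavior the commit claims; a naive `log(sigmoid(x))` would underflow to `log(0)` and yield `-inf`/NaN at those magnitudes.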
2025-07-14 00:48:08 -04:00