mirror of
https://github.com/MLSysBook/TinyTorch.git
synced 2026-03-09 10:12:08 -05:00
New Features:
- Add MSEBackward gradient computation for regression tasks
- Patch MSELoss in enable_autograd() for gradient tracking
- All 3 loss functions now support autograd: MSE, BCE, CrossEntropy

Test Suite Organization:
- Reorganize tests/ into focused directories
- Create tests/integration/ for cross-module tests
- Create tests/05_autograd/ for autograd edge cases
- Create tests/debugging/ for common student pitfalls
- Add comprehensive tests/README.md explaining test philosophy

Integration Tests:
- Move test_gradient_flow.py to integration/
- 20 comprehensive gradient flow tests
- Tests cover: tensors, layers, activations, losses, optimizers
- Tests validate: basic ops, chain rule, broadcasting, training loops
- 19/20 tests passing (MSE now fixed!)

Results:
✅ Perceptron learns: 50% → 93% accuracy
✅ Clean test organization guides future development
✅ Tests catch the exact bugs that broke training

Pedagogical Value:
- Test organization teaches testing best practices
- Gradient flow tests show what integration testing catches
- Sets foundation for debugging/diagnostic tests
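The MSEBackward addition above boils down to the analytic gradient of mean squared error, dL/dpred = 2 * (pred - target) / n. The sketch below illustrates that gradient with a finite-difference check; the function names (`mse_forward`, `mse_backward`) are hypothetical stand-ins, not TinyTorch's actual API.

```python
import numpy as np

def mse_forward(pred, target):
    # Mean squared error: L = mean((pred - target)^2)
    return np.mean((pred - target) ** 2)

def mse_backward(pred, target):
    # Analytic gradient of MSE w.r.t. pred: dL/dpred = 2 * (pred - target) / n
    return 2.0 * (pred - target) / pred.size

# Finite-difference check: the numeric gradient should match the analytic one.
pred = np.array([0.5, 1.5, 2.0])
target = np.array([1.0, 1.0, 1.0])
eps = 1e-6
analytic = mse_backward(pred, target)
numeric = np.empty_like(pred)
for i in range(pred.size):
    plus, minus = pred.copy(), pred.copy()
    plus[i] += eps
    minus[i] -= eps
    numeric[i] = (mse_forward(plus, target) - mse_forward(minus, target)) / (2 * eps)
assert np.allclose(analytic, numeric, atol=1e-5)
```

This is exactly the style of check the gradient flow tests rely on: a wrong sign or a missing `/ n` in the backward pass fails the comparison immediately.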
13 lines
478 B
Python
"""
|
|
Debugging tests for common student pitfalls.
|
|
|
|
These tests identify and diagnose common issues students encounter:
|
|
- Vanishing gradients (ReLU dying, sigmoid saturation)
|
|
- Exploding gradients (unstable initialization)
|
|
- Silent failures (forgot backward(), forgot zero_grad())
|
|
- Common mistakes (wrong loss function, learning rate issues)
|
|
|
|
Goal: When a test fails, the error message should guide students
|
|
to the solution. These are pedagogical tests that teach debugging.
|
|
"""
|
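To illustrate the style of diagnostic this package aims for, here is a minimal sketch of a dying-ReLU check whose failure message points at the fix. The helper `diagnose_dead_relu` is a hypothetical example, not a function from the TinyTorch test suite, and uses plain NumPy rather than TinyTorch tensors.

```python
import numpy as np

def relu_grad(z):
    # ReLU gradient: 1 where the pre-activation is positive, else 0.
    return (z > 0).astype(float)

def diagnose_dead_relu(z, threshold=0.9):
    """Return a warning string if most units have zero gradient, else None."""
    dead_fraction = float(np.mean(relu_grad(z) == 0.0))
    if dead_fraction >= threshold:
        return (f"{dead_fraction:.0%} of ReLU units are dead: gradients are "
                "zero, so these weights will never update. Check the "
                "initialization or lower the learning rate.")
    return None

# A layer whose pre-activations are all negative is fully dead:
assert diagnose_dead_relu(np.array([-3.0, -1.5, -0.2])) is not None

# A healthy layer produces no warning:
assert diagnose_dead_relu(np.array([0.4, -0.1, 2.0])) is None
```

The key design choice is that the assertion message does the teaching: instead of a bare `assert`, the student sees which symptom was detected and what to try next.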