mirror of
https://github.com/MLSysBook/TinyTorch.git
synced 2026-05-06 05:47:31 -05:00
New Features:
- Add MSEBackward gradient computation for regression tasks
- Patch MSELoss in enable_autograd() for gradient tracking
- All 3 loss functions now support autograd: MSE, BCE, CrossEntropy

Test Suite Organization:
- Reorganize tests/ into focused directories
- Create tests/integration/ for cross-module tests
- Create tests/05_autograd/ for autograd edge cases
- Create tests/debugging/ for common student pitfalls
- Add comprehensive tests/README.md explaining test philosophy

Integration Tests:
- Move test_gradient_flow.py to integration/
- 20 comprehensive gradient flow tests
- Tests cover: tensors, layers, activations, losses, optimizers
- Tests validate: basic ops, chain rule, broadcasting, training loops
- 19/20 tests passing (MSE now fixed!)

Results:
✅ Perceptron learns: 50% → 93% accuracy
✅ Clean test organization guides future development
✅ Tests catch the exact bugs that broke training

Pedagogical Value:
- Test organization teaches testing best practices
- Gradient flow tests show what integration testing catches
- Sets foundation for debugging/diagnostic tests
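The MSEBackward addition can be illustrated with a minimal NumPy sketch. For MSE loss L = mean((pred − target)²), the gradient with respect to the predictions is dL/dpred = 2·(pred − target)/N. The function names below are hypothetical and do not reflect TinyTorch's actual API:

```python
import numpy as np

def mse_forward(pred, target):
    # Mean squared error: L = mean((pred - target)^2)
    return np.mean((pred - target) ** 2)

def mse_backward(pred, target):
    # Gradient of MSE w.r.t. pred: dL/dpred = 2 * (pred - target) / N
    n = pred.size
    return 2.0 * (pred - target) / n

pred = np.array([0.5, 1.5, 2.0])
target = np.array([1.0, 1.0, 2.0])
loss = mse_forward(pred, target)    # mean of [0.25, 0.25, 0.0]
grad = mse_backward(pred, target)   # 2 * [-0.5, 0.5, 0.0] / 3
```

The division by N is what the `enable_autograd()` patch must propagate; forgetting it scales every downstream gradient by the batch size.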
14 lines
414 B
Python
"""
|
|
Integration tests for TinyTorch.
|
|
|
|
These tests validate that multiple modules work together correctly.
|
|
They catch issues that unit tests miss, like:
|
|
- Gradient flow through entire training pipelines
|
|
- Module compatibility and interface contracts
|
|
- End-to-end training scenarios
|
|
|
|
Critical for catching bugs like:
|
|
- Missing autograd integration
|
|
- Shape mismatches in broadcasting
|
|
- Optimizer parameter updates
|
|
""" |