Commit Graph

4 Commits

Author SHA1 Message Date
Vijay Janapa Reddi
90581b23c0 Update test suite for module restructuring
Updated test imports and paths after modules/source/ removal:
- Progressive integration tests for modules 03, 06, 08, 13, 14
- Checkpoint integration tests
- Module completion orchestrator
- Optimizer integration tests
- Gradient flow regression tests

Updated test documentation:
- tests/README.md with new module paths
- tests/TEST_STRATEGY.md with restructuring notes

All tests now reference modules/XX_name/ instead of modules/source/.
2025-11-10 19:42:23 -05:00
Vijay Janapa Reddi
1aea4b3aba Add MSEBackward and organize comprehensive test suite
New Features:
- Add MSEBackward gradient computation for regression tasks
- Patch MSELoss in enable_autograd() for gradient tracking
- All 3 loss functions now support autograd: MSE, BCE, CrossEntropy
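
The MSEBackward gradient this commit describes can be sketched as follows. This is a minimal NumPy illustration of the math, not the actual TinyTorch implementation; the function name `mse_backward` is hypothetical:

```python
import numpy as np

def mse_backward(pred, target):
    # For L = mean((pred - target)^2), the gradient w.r.t. pred is
    # dL/dpred = 2 * (pred - target) / N, where N is the element count.
    return 2.0 * (pred - target) / pred.size

pred = np.array([0.5, 0.8, 0.2])
target = np.array([0.0, 1.0, 0.0])
grad = mse_backward(pred, target)

# Sanity check against a finite-difference estimate on the first element.
eps = 1e-6
bumped = pred.copy()
bumped[0] += eps
numeric = (np.mean((bumped - target) ** 2) - np.mean((pred - target) ** 2)) / eps
assert abs(grad[0] - numeric) < 1e-4
```

Hooking such a function into autograd (as `enable_autograd()` does for MSELoss here) is what lets gradients flow from the loss back through the network.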

Test Suite Organization:
- Reorganize tests/ into focused directories
- Create tests/integration/ for cross-module tests
- Create tests/05_autograd/ for autograd edge cases
- Create tests/debugging/ for common student pitfalls
- Add comprehensive tests/README.md explaining test philosophy

Integration Tests:
- Move test_gradient_flow.py to integration/
- 20 comprehensive gradient flow tests
- Tests cover: tensors, layers, activations, losses, optimizers
- Tests validate: basic ops, chain rule, broadcasting, training loops
- 19/20 tests passing (MSE now fixed!)
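
The gradient flow tests above typically validate the chain rule against a finite-difference estimate. A minimal sketch of that technique (assumed here, not taken from the repository):

```python
import numpy as np

def numerical_grad(f, x, eps=1e-5):
    # Central-difference estimate of df/dx for a scalar-valued f.
    g = np.zeros_like(x)
    for i in range(x.size):
        up, down = x.copy(), x.copy()
        up.flat[i] += eps
        down.flat[i] -= eps
        g.flat[i] = (f(up) - f(down)) / (2 * eps)
    return g

# Chain rule through two ops: loss = sum((2*x)^2) = sum(4*x^2), so dloss/dx = 8*x.
x = np.array([1.0, -2.0, 3.0])
loss = lambda v: np.sum((2 * v) ** 2)
analytic = 8 * x
assert np.allclose(numerical_grad(loss, x), analytic, atol=1e-4)
```

Comparing an autograd backward pass against this estimate is exactly the kind of check that catches a broken or missing backward function before it silently breaks training.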

Results:
- Perceptron learns: 50% → 93% accuracy
- Clean test organization guides future development
- Tests catch the exact bugs that broke training
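
The perceptron result above can be reproduced in spirit with a classic perceptron update rule. This is a generic sketch on toy data, not the TinyTorch perceptron or its dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy linearly separable data: label is 1 when x0 + x1 > 0.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(100):
    for xi, yi in zip(X, y):
        pred = float(np.dot(w, xi) + b > 0)
        # Perceptron update: nudge weights toward misclassified points.
        w += lr * (yi - pred) * xi
        b += lr * (yi - pred)

acc = np.mean((X @ w + b > 0) == y.astype(bool))
```

Starting from zero weights the model is at chance; after training on separable data, accuracy climbs well above 90%, mirroring the 50% → 93% trajectory reported for the perceptron here.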

Pedagogical Value:
- Test organization teaches testing best practices
- Gradient flow tests show what integration testing catches
- Sets foundation for debugging/diagnostic tests
2025-09-30 13:57:40 -04:00
Vijay Janapa Reddi
a9ee348355 feat: Add comprehensive integration tests for attention module
Created test_tensor_attention_integration.py:
- Basic tensor-attention integration with real TinyTorch components
- Self-attention wrapper testing with proper Tensor objects
- Attention masking integration (causal, padding, bidirectional)
- Batched tensor processing and different data types
- Numerical stability and gradient flow compatibility
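
Of the mask types tested above, causal masking is the one language models depend on. A minimal NumPy sketch of the idea (an illustration, not TinyTorch's implementation):

```python
import numpy as np

def causal_mask(seq_len):
    # Lower-triangular mask: position i may attend only to positions <= i.
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

def masked_softmax(scores, mask):
    # Masked-out positions get -inf so softmax assigns them zero weight;
    # subtracting the row max keeps the exponentials numerically stable.
    scores = np.where(mask, scores, -np.inf)
    scores = scores - scores.max(axis=-1, keepdims=True)
    e = np.exp(scores)
    return e / e.sum(axis=-1, keepdims=True)

scores = np.random.randn(4, 4)
weights = masked_softmax(scores, causal_mask(4))
# Row 0 can only attend to position 0; the upper triangle is zero.
assert weights[0, 0] == 1.0 and np.all(weights[0, 1:] == 0.0)
assert np.allclose(weights.sum(axis=-1), 1.0)
```

Padding and bidirectional masks use the same mechanism with a different boolean pattern, which is why one masked-softmax path can serve all three test cases.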

Created test_attention_pipeline_integration.py:
- Complete transformer-like pipeline testing
- Multi-layer attention stacks (transformer encoders)
- Causal masking for language modeling workflows
- Encoder-decoder architecture integration
- Cross-module integration with dense layers and activations
- Real-world scenarios: sequence classification, seq2seq translation
- Scalability testing across different sequence lengths and dimensions

Updated tests/README.md:
- Documented new attention integration tests (15→17 total tests)
- Organized tests by category (Foundation, Architecture, Training, Inference Serving)
- Added specific usage examples for attention tests
- Clear documentation of test coverage and purpose

Integration tests ensure:
- Attention works with real Tensor objects (not mocks)
- Cross-module compatibility with dense, spatial, activations
- Complete ML workflows (classification, translation, transformers)
- Realistic transformer architectures and patterns
- System-level regression detection for attention functionality
2025-07-18 00:21:48 -04:00
Vijay Janapa Reddi
4912f794d2 🛡️ Add protection for critical tests/ directory
- Add tests/README.md with clear warnings and recovery instructions
- Add tests/.gitkeep to ensure directory is always tracked
- Protect 15 integration test files (~100KB of valuable code)
- Provide git recovery commands if accidentally deleted

Mitigates the risk of accidental deletion while keeping standard Python conventions.
2025-07-15 10:03:05 -04:00