Vijay Janapa Reddi
e779d67dcf
feat: Add comprehensive integration tests for attention module
...
✅ Created test_tensor_attention_integration.py:
- Basic tensor-attention integration with real TinyTorch components
- Self-attention wrapper testing with proper Tensor objects
- Attention masking integration (causal, padding, bidirectional)
- Batched tensor processing and different data types
- Numerical stability and gradient flow compatibility
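The masking and numerical-stability behavior these tests cover can be sketched with plain NumPy arrays standing in for TinyTorch Tensors (the `scaled_dot_product_attention` helper and its signature are illustrative assumptions, not the module's actual API):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, with an optional boolean mask."""
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)
    if mask is not None:
        # Masked-out positions get a large negative score so softmax drives them to ~0
        scores = np.where(mask, scores, -1e9)
    # Subtract the row max before exponentiating for numerical stability
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Causal mask: position i may only attend to positions <= i
seq_len, d_model = 4, 8
causal_mask = np.tril(np.ones((seq_len, seq_len), dtype=bool))
rng = np.random.default_rng(0)
q = k = v = rng.standard_normal((seq_len, d_model))
out, w = scaled_dot_product_attention(q, k, v, causal_mask)
assert np.allclose(np.triu(w, 1), 0.0)   # no attention to future positions
assert np.allclose(w.sum(axis=-1), 1.0)  # each row is a valid distribution
```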
✅ Created test_attention_pipeline_integration.py:
- Complete transformer-like pipeline testing
- Multi-layer attention stacks (transformer encoders)
- Causal masking for language modeling workflows
- Encoder-decoder architecture integration
- Cross-module integration with dense layers and activations
- Real-world scenarios: sequence classification, seq2seq translation
- Scalability testing across different sequence lengths and dimensions
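A multi-layer attention stack of the kind these pipeline tests exercise can be sketched as follows; the layer structure (self-attention plus a ReLU feed-forward sublayer, each with a residual connection) is a minimal transformer-encoder-like assumption, not TinyTorch's actual layer API:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """Project x to queries/keys/values and apply scaled dot-product attention."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def encoder_layer(x, params):
    """One encoder block: self-attention + residual, then ReLU feed-forward + residual."""
    w_q, w_k, w_v, w_ff = params
    x = x + self_attention(x, w_q, w_k, w_v)  # attention sublayer
    x = x + np.maximum(0.0, x @ w_ff)         # dense + activation sublayer
    return x

rng = np.random.default_rng(1)
d_model, n_layers = 16, 3
layers = [tuple(rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(4))
          for _ in range(n_layers)]
x = rng.standard_normal((4, 10, d_model))     # (batch, seq_len, d_model)
for params in layers:
    x = encoder_layer(x, params)
assert x.shape == (4, 10, d_model)            # shape preserved through the stack
```

Because each block preserves the `(batch, seq_len, d_model)` shape, the same check can be repeated across different sequence lengths and dimensions to probe scalability.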
✅ Updated tests/README.md:
- Documented new attention integration tests (15→17 total tests)
- Organized tests by category (Foundation, Architecture, Training, Inference Serving)
- Added specific usage examples for attention tests
- Clear documentation of test coverage and purpose
Integration tests ensure:
- Attention works with real Tensor objects (not mocks)
- Cross-module compatibility with dense, spatial, and activation modules
- Complete ML workflows (classification, translation, transformers)
- Realistic transformer architectures and patterns
- System-level regression detection for attention functionality
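A sequence-classification workflow of the kind listed above can be sketched as a small head on top of attention output; the `classify_sequence` helper and `w_cls` weight are hypothetical names for illustration:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def classify_sequence(x, w_cls):
    """Mean-pool token representations, then project to class probabilities."""
    pooled = x.mean(axis=1)          # (batch, d_model)
    return softmax(pooled @ w_cls)   # (batch, n_classes)

rng = np.random.default_rng(2)
x = rng.standard_normal((4, 10, 16))   # attention output: (batch, seq_len, d_model)
w_cls = rng.standard_normal((16, 3))   # 3-class classifier weights
probs = classify_sequence(x, w_cls)
assert probs.shape == (4, 3)
assert np.allclose(probs.sum(axis=-1), 1.0)  # valid probability rows
```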
2025-07-18 00:21:48 -04:00