Mirror of https://github.com/harvard-edge/cs249r_book.git, synced 2026-04-30 09:38:38 -05:00
Remove test_attention_pipeline_integration.py and test_tensor_attention_integration.py, which test SelfAttention, create_causal_mask, and other components that do not exist in the attention module. These tests were always skipped and provided no test value. The existing attention tests (test_attention_core.py) properly test the actual implemented components: scaled_dot_product_attention and MultiHeadAttention.
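
For context, below is a minimal, self-contained sketch of the kind of component the retained tests exercise. The scaled_dot_product_attention function and the accompanying check are illustrative stand-ins: the actual signatures in the repo's attention module and in test_attention_core.py are not shown here, so the argument names, return values, and test body are assumptions, not the repo's real API.

```python
# Illustrative sketch only; not the repo's actual implementation.
import math
import torch


def scaled_dot_product_attention(query, key, value, mask=None):
    """Compute softmax(Q K^T / sqrt(d_k)) V and return (output, attention_weights)."""
    d_k = query.size(-1)
    scores = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(d_k)
    if mask is not None:
        # Positions where mask == 0 are excluded from attention.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return torch.matmul(weights, value), weights


def test_attention_shapes_and_weights():
    # Hypothetical shape/normalization check in the spirit of test_attention_core.py.
    q = k = v = torch.randn(2, 4, 8)  # (batch, seq_len, d_k)
    out, w = scaled_dot_product_attention(q, k, v)
    assert out.shape == (2, 4, 8)                              # output matches value shape
    assert torch.allclose(w.sum(dim=-1), torch.ones(2, 4))     # attention rows sum to 1
```

A test like this targets only behavior the module actually implements, which is the distinction the removal is based on: the deleted integration tests imported SelfAttention and create_causal_mask, which have no counterpart in the module, so they could never run unskipped.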