- Change from `.data` extraction to Tensor arithmetic (`x - mean`, `diff * diff`, `x / std`)
- Preserve the computation graph through normalization
- The `std` tensor now preserves `requires_grad` correctly
LayerNorm is used before and after attention in transformer blocks, so gradients must flow through it for training to work.
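A minimal sketch of the change, written against PyTorch for illustration (the project's own Tensor class is assumed to expose the same `requires_grad` semantics): every step uses Tensor arithmetic rather than pulling raw values out via `.data`, so autograd can trace the whole normalization.

```python
import torch

def layer_norm(x: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    # Tensor arithmetic throughout -- no .data extraction, so the
    # computation graph is preserved end to end.
    mean = x.mean(dim=-1, keepdim=True)
    diff = x - mean                            # x - mean as a Tensor op
    var = (diff * diff).mean(dim=-1, keepdim=True)
    std = (var + eps).sqrt()                   # std keeps requires_grad
    return diff / std                          # x / std stays in the graph

x = torch.randn(2, 4, requires_grad=True)
y = layer_norm(x)
y.sum().backward()                             # gradients reach x
```

Had any intermediate been computed from `x.data` (e.g. `x.data - mean`), `std` would have been a detached constant and `backward()` would not populate `x.grad`.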