mirror of
https://github.com/harvard-edge/cs249r_book.git
synced 2026-05-01 18:19:18 -05:00
chore(tinytorch): bump version to v0.1.4
TinyTorch v0.1.4: Educational improvements and module path fixes

Breaking Changes:
- fix: correct module path from core.transformer to core.transformers (14 files)

Educational Enhancements:
- refactor: remove premature backward() methods for cleaner progressive learning
- feat: add educational scaffolding with TODO/hints in Module 20 Capstone
- docs: remove forward references to Module 06 in early modules

Bug Fixes:
- fix: TransformerBlock now supports ff_dim parameter for flexibility
- fix: wrap module print statements in if __name__ guards

Code Quality:
- refactor: reorganize Quantizer class export location
- refactor: improve module integration in tinytorch.__init__.py
- chore: remove outdated TINYTORCH_FORMATTING_STANDARDS.md (415 lines)

Stats: 29 files changed, 357 insertions(+), 711 deletions(-)
@@ -137,7 +137,7 @@ def test_regression_layernorm_gradient_flow():
     """
     print("Testing regression: LayerNorm gradient flow...")

-    from tinytorch.core.transformer import LayerNorm
+    from tinytorch.core.transformers import LayerNorm

     ln = LayerNorm(4)
     ln.gamma.requires_grad = True