chore(tinytorch): bump version to v0.1.4

TinyTorch v0.1.4: Educational improvements and module path fixes

Breaking Changes:
- fix: correct module path from core.transformer to core.transformers (14 files)

Educational Enhancements:
- refactor: remove premature backward() methods for cleaner progressive learning
- feat: add educational scaffolding with TODO/hints in Module 20 Capstone
- docs: remove forward references to Module 06 in early modules
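The TODO/hint scaffolding mentioned above follows a common pattern in teaching codebases: a docstring carries the student-facing TODO and hint, and the body below is the reference implementation. This is an illustrative sketch only (not the actual Module 20 Capstone code), using a hypothetical softmax exercise:

```python
import math

def softmax(logits):
    """TODO(student): implement a numerically stable softmax.

    HINT: subtract max(logits) before exponentiating to avoid
    overflow for large inputs.
    """
    m = max(logits)                               # stability shift
    exps = [math.exp(x - m) for x in logits]      # shifted exponentials
    total = sum(exps)
    return [e / total for e in exps]              # normalize to sum to 1
```

In the shipped scaffolding the body would be stubbed out (e.g. `raise NotImplementedError`) so students implement it themselves; the hints keep the exercise tractable without giving the answer away.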

Bug Fixes:
- fix: TransformerBlock now supports ff_dim parameter for flexibility
- fix: wrap module print statements in if __name__ guards
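The `if __name__` guard fix above uses the standard Python idiom: demo output runs only when a file is executed directly, not when it is imported as a module. A minimal sketch (the function is a placeholder, not actual TinyTorch code):

```python
def relu(x):
    """Toy function standing in for a module's exported API."""
    return max(0.0, x)

if __name__ == "__main__":
    # These prints fire on `python this_module.py`, but stay silent
    # when another module does `import this_module` -- which is why
    # unguarded prints were polluting output during imports.
    print("relu(-2.0) =", relu(-2.0))
    print("relu(3.5) =", relu(3.5))
```

Without the guard, every `import` of the module would execute the prints as a side effect, which is what this release fixes.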

Code Quality:
- refactor: reorganize Quantizer class export location
- refactor: improve module integration in tinytorch.__init__.py
- chore: remove outdated TINYTORCH_FORMATTING_STANDARDS.md (415 lines)

Stats: 29 files changed, 357 insertions(+), 711 deletions(-)
Author: Vijay Janapa Reddi
Date: 2026-01-17 10:25:59 -05:00
Commit: c420fe7858 (parent: be13cb552a)


@@ -137,7 +137,7 @@ def test_regression_layernorm_gradient_flow():
     """
     print("Testing regression: LayerNorm gradient flow...")
-    from tinytorch.core.transformer import LayerNorm
+    from tinytorch.core.transformers import LayerNorm
     ln = LayerNorm(4)
     ln.gamma.requires_grad = True