Vijay Janapa Reddi 3e29b69ca8 Fix Tensor slicing gradient tracking - position embeddings now learn
CRITICAL FIX: Monkey-patching for __getitem__ was not in source modules

PROBLEM:
- The previous fix modified tinytorch/core/autograd.py (the compiled output)
- but NOT modules/05_autograd/autograd.py (the source)
- Running the export regenerated the compiled files WITHOUT the monkey-patching code
- Result: Tensor slicing had NO gradient tracking

SOLUTION:
1. Added tracked_getitem() to modules/05_autograd/autograd.py
2. Added _original_getitem store in enable_autograd()
3. Added Tensor.__getitem__ = tracked_getitem installation
4. Exported all modules (tensor, autograd, embeddings)
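The pattern in steps 1-3 can be sketched as below. This is a minimal, self-contained illustration (not the actual TinyTorch source): the toy `Tensor` class, its fields, and the scatter-based backward are assumptions; only the names `tracked_getitem`, `_original_getitem`, and the `Tensor.__getitem__` installation mirror the commit.

```python
# Minimal sketch of monkey-patching __getitem__ for gradient tracking.
# The Tensor class here is a toy stand-in, NOT the TinyTorch implementation.
import numpy as np

class Tensor:
    def __init__(self, data, requires_grad=False):
        self.data = np.asarray(data, dtype=float)
        self.requires_grad = requires_grad
        self.grad = None
        self._backward_fn = None

    def __getitem__(self, key):  # "original" untracked slicing
        return Tensor(self.data[key])

    def backward(self, grad=None):
        if grad is None:
            grad = np.ones_like(self.data)
        self.grad = grad if self.grad is None else self.grad + grad
        if self._backward_fn is not None:
            self._backward_fn(grad)

def enable_autograd():
    # Step 2: store the original so it could be restored later.
    if not hasattr(Tensor, "_original_getitem"):
        Tensor._original_getitem = Tensor.__getitem__

    # Step 1: a tracked version that attaches a SliceBackward-style closure.
    def tracked_getitem(self, key):
        out = Tensor._original_getitem(self, key)
        if self.requires_grad:
            out.requires_grad = True
            def slice_backward(grad):
                # Scatter the slice's gradient back into a zero tensor
                # shaped like the input, then propagate upstream.
                full = np.zeros_like(self.data)
                full[key] = grad
                self.backward(full)
            out._backward_fn = slice_backward
        return out

    # Step 3: install the tracked version on the class.
    Tensor.__getitem__ = tracked_getitem

enable_autograd()
x = Tensor([1.0, 2.0, 3.0, 4.0, 5.0], requires_grad=True)
x[:3].backward()
print(x.grad)  # → [1. 1. 1. 0. 0.]
```

Because the patch is installed by `enable_autograd()` itself, re-exporting the module cannot drop it: the tracked slicing lives in the same source file that defines the rest of the autograd machinery.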

VERIFICATION TESTS:
✓ Tensor slicing attaches SliceBackward
✓ Gradients flow correctly: x[:3].backward() → x.grad = [1, 1, 1, 0, 0]
✓ Position embeddings' .grad is not None and has non-zero values
✓ All 19/19 parameters receive gradients and update

TRAINING RESULTS:
- Loss drops: 1.58 → 1.26 (vs 1.62 → 1.24 before)
- Training accuracy: 2.7% (vs 0% before)
- Test accuracy: Still 0% (needs hyperparameter tuning)

MODEL IS LEARNING (slightly) - this is progress!

Next steps: Hyperparameter tuning (more epochs, different LR, larger model)
2025-11-22 18:29:38 -05:00