mirror of
https://github.com/MLSysBook/TinyTorch.git
synced 2026-05-02 11:23:09 -05:00
Fix nbdev export system across all 20 modules
PROBLEM:
- nbdev requires a #| export directive on EACH cell to export when using # %% markers
- Cell markers inside class definitions split classes across multiple cells
- Only partial classes were being exported to the tinytorch package
- Missing matmul, arithmetic operations, and activation classes in exports

SOLUTION:
1. Removed # %% cell markers INSIDE class definitions (kept classes as single units)
2. Added #| export to the imports cell at the top of each module
3. Added #| export before each exportable class definition in all 20 modules
4. Added __call__ method to Sigmoid for functional usage
5. Fixed numpy import (moved to module level from __init__)

MODULES FIXED:
- 01_tensor: Tensor class with all operations (matmul, arithmetic, shape ops)
- 02_activations: Sigmoid, ReLU, Tanh, GELU, Softmax classes
- 03_layers: Linear, Dropout classes
- 04_losses: MSELoss, CrossEntropyLoss, BinaryCrossEntropyLoss classes
- 05_autograd: Function, AddBackward, MulBackward, MatmulBackward, SumBackward
- 06_optimizers: Optimizer, SGD, Adam, AdamW classes
- 07_training: CosineSchedule, Trainer classes
- 08_dataloader: Dataset, TensorDataset, DataLoader classes
- 09_spatial: Conv2d, MaxPool2d, AvgPool2d, SimpleCNN classes
- 10-20: All exportable classes in remaining modules

TESTING:
- Test functions use 'if __name__ == "__main__"' guards
- Tests run in notebooks but NOT on import
- Rosenblatt Perceptron milestone working perfectly

RESULT:
✅ All 20 modules export correctly
✅ Perceptron (1957) milestone functional
✅ Clean separation: development (modules/source) vs package (tinytorch)
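As a minimal sketch of the cell layout this commit standardizes on: #| export on the imports cell and on each class cell, classes kept as single cells, and tests behind a __main__ guard so they run in the notebook but not on import. The Sigmoid class here is illustrative (loosely based on the module list above), not the exact TinyTorch source.

```python
# %%
#| export
import numpy as np  # module-level import so it ships with the exported package

# %% nbgrader={"grade": false, "grade_id": "sigmoid", "solution": true}
#| export
class Sigmoid:
    """Sigmoid activation. __call__ enables functional usage: Sigmoid()(x)."""

    def forward(self, x):
        return 1.0 / (1.0 + np.exp(-x))

    def __call__(self, x):
        return self.forward(x)

# %%
def test_sigmoid():
    s = Sigmoid()
    assert np.allclose(s(np.array([0.0])), 0.5)

# Guard: tests execute when the notebook/script runs, but NOT on import,
# so `from tinytorch import ...` stays side-effect free.
if __name__ == "__main__":
    test_sigmoid()
    print("sigmoid tests passed")
```

The key point is that nbdev exports only cells carrying #| export, so a class split across # %% cells would be exported only partially, which is exactly the bug described above.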
@@ -778,6 +778,7 @@ Regular Linear Layer: QuantizedLinear Layer:
 """
 # %% nbgrader={"grade": false, "grade_id": "quantized_linear", "solution": true}
+#| export
 class QuantizedLinear:
     """Quantized version of Linear layer using INT8 arithmetic."""
 
 
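For context on what the newly exported class does, here is a minimal sketch of a Linear layer using INT8 arithmetic, assuming symmetric per-tensor quantization with a float rescale; the actual QuantizedLinear in the repo may differ in scheme and API.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor quantization: floats -> int8 plus a scale factor."""
    max_abs = np.max(np.abs(x))
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

class QuantizedLinear:
    """Sketch: weights stored as int8, matmul accumulated in int32, rescaled."""

    def __init__(self, weight, bias):
        self.q_weight, self.w_scale = quantize_int8(weight)
        self.bias = bias

    def __call__(self, x):
        q_x, x_scale = quantize_int8(x)
        # Integer matmul with int32 accumulation to avoid int8 overflow.
        acc = q_x.astype(np.int32) @ self.q_weight.astype(np.int32).T
        # Rescale the integer result back to float and add the bias.
        return acc * (x_scale * self.w_scale) + self.bias
```

The quantized output approximates the float Linear output `x @ W.T + b` up to rounding error on the order of the quantization scales.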