TinyTorch/tinytorch/optim/__init__.py
Vijay Janapa Reddi, commit c955437078 (2025-09-23 08:10:47 -04:00): Organize package with nn and optim modules
Stage 5 of TinyTorch API simplification:
- Created tinytorch.nn package with PyTorch-compatible interface
- Added Module base class in nn.modules for automatic parameter registration
- Added functional module with relu, flatten, max_pool2d operations
- Created tinytorch.optim package exposing Adam and SGD optimizers
- Updated main __init__.py to export nn and optim modules
- Linear and Conv2d now available through clean nn interface

Students can now write PyTorch-like code:
import tinytorch.nn as nn
import tinytorch.nn.functional as F
model = nn.Linear(784, 10)
x = F.relu(model(x))
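
The new optim package slots into the same style (a sketch; the lr value
here is only illustrative):

import tinytorch.optim as optim
optimizer = optim.Adam(model.parameters(), lr=0.001)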

"""
TinyTorch Optimization Module (optim)

This package provides PyTorch-compatible optimizers for training neural networks.

Optimizers:
    - Adam: Adaptive moment estimation optimizer
    - SGD: Stochastic gradient descent
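
For reference, the textbook Adam update (Kingma & Ba, 2015) per parameter
is sketched below; whether the student implementation in core.optimizers
applies the bias correction is up to that module:

    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    param -= lr * m_hat / (v_hat ** 0.5 + eps)  # parameter step
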
Example Usage:

    import tinytorch.nn as nn
    import tinytorch.optim as optim

    model = nn.Linear(784, 10)
    optimizer = optim.Adam(model.parameters(), lr=0.001)

    # Training loop (num_epochs, dataloader, and criterion are assumed
    # to be defined elsewhere)
    for epoch in range(num_epochs):
        for batch in dataloader:
            # Forward pass
            output = model(batch.data)
            loss = criterion(output, batch.targets)

            # Backward pass
            loss.backward()

            # Update parameters, then clear gradients for the next batch
            optimizer.step()
            optimizer.zero_grad()

The optimizers work with any Module that implements a parameters() method,
providing the clean training interface students expect from PyTorch.
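
For example, a hand-written Module trains the same way (a minimal sketch:
SimpleNet and the lr value are illustrative, and SGD is assumed to accept
an lr keyword just as Adam does above):

    import tinytorch.nn as nn
    import tinytorch.nn.functional as F
    import tinytorch.optim as optim

    class SimpleNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.fc1 = nn.Linear(784, 128)  # registered automatically by
            self.fc2 = nn.Linear(128, 10)   # the Module base class

        def forward(self, x):
            return self.fc2(F.relu(self.fc1(x)))

    model = SimpleNet()
    optimizer = optim.SGD(model.parameters(), lr=0.01)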
"""
# Import optimizers from core (these contain the student implementations)
from ..core.optimizers import Adam, SGD
# Export the main public API
__all__ = [
'Adam',
'SGD'
]