TinyTorch/modules/06_optimizers/module.yaml
Vijay Janapa Reddi 45a9cef548 Major reorganization: Remove setup module, renumber all modules, add tito setup command and numeric shortcuts
- Removed 01_setup module (archived to archive/setup_module)
- Renumbered all modules: tensor is now 01, activations is 02, etc.
- Added tito setup command for environment setup and package installation
- Added numeric shortcuts: tito 01, tito 02, etc. for quick module access
- Fixed view command to find dev files correctly
- Updated module dependencies and references
- Improved user experience: immediate ML learning instead of boring setup
2025-09-28 07:02:08 -04:00


components:
- SGD
- Adam
- StepLR
- gradient_descent_step
dependencies:
  enables:
  - training
  - compression
  - mlops
  prerequisites:
  - tensor
  - autograd
description: Gradient-based parameter optimization algorithms
difficulty: "⭐⭐⭐⭐"
exports_to: tinytorch.core.optimizers
files:
  dev_file: optimizers_dev.py
  readme: README.md
  tests: inline
name: optimizers
time_estimate: 6-8 hours
title: Optimizers
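The components listed above (`SGD`, `Adam`, `StepLR`, `gradient_descent_step`) correspond to standard gradient-based update rules. The actual `tinytorch.core.optimizers` API lives in `optimizers_dev.py` and is not shown here, so the following is only a minimal NumPy sketch of the vanilla update `w ← w − lr · ∇w` that `gradient_descent_step` presumably implements; the function name and signature are illustrative assumptions:

```python
import numpy as np

def gradient_descent_step(params, grads, lr=0.01):
    # Hypothetical sketch (not the tinytorch implementation):
    # apply w <- w - lr * dL/dw to each parameter tensor.
    return [p - lr * g for p, g in zip(params, grads)]

# Example: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w = np.array([0.0])
for _ in range(200):
    grad = 2.0 * (w - 3.0)
    (w,) = gradient_descent_step([w], [grad], lr=0.1)
print(w)  # converges toward 3.0
```

`Adam` adds per-parameter adaptive step sizes via running moment estimates, and `StepLR` decays `lr` by a fixed factor every N epochs; both reduce to this same basic update in the limit.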