TinyTorch/modules/source/08_optimizers/module.yaml
Vijay Janapa Reddi 06ca2ee802 Standardize module.yaml files for instructor/staff workflow
- Remove student-facing bloat (learning objectives, time estimates, pedagogical details)
- Remove assessment sections (not needed for operational metadata)
- Streamline to essential system information only:
  - Module identification and dependencies
  - Package export configuration
  - File structure and component listings

- Updated existing files (6): setup, tensor, activations, layers, autograd, optimizers
- Created missing files (3): networks, cnn, dataloader
- Consistent 25-26 line format across all 9 modules

Result: Pure operational metadata for CLI tools and build systems
Perfect for instructor/staff development workflow
2025-07-14 00:08:05 -04:00


# TinyTorch Module Metadata
# Essential system information for CLI tools and build systems
name: "optimizers"
title: "Optimizers"
description: "Gradient-based parameter optimization algorithms"

# Dependencies - Used by CLI for module ordering and prerequisites
dependencies:
  prerequisites: ["setup", "tensor", "autograd"]
  enables: ["training", "compression", "mlops"]

# Package Export - What gets built into tinytorch package
exports_to: "tinytorch.core.optimizers"

# File Structure - What files exist in this module
files:
  dev_file: "optimizers_dev.py"
  readme: "README.md"
  tests: "inline"

# Components - What's implemented in this module
components:
  - "SGD"
  - "Adam"
  - "StepLR"
  - "gradient_descent_step"
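
A minimal sketch of how a CLI tool might consume the `dependencies.prerequisites` field to compute a module build order. Only the `optimizers` prerequisite list comes from the YAML above; the other modules' prerequisite lists and the `build_order` helper are hypothetical illustrations, not TinyTorch's actual implementation.

```python
# Hypothetical sketch: ordering modules by "dependencies.prerequisites"
# as a CLI tool could do with each module.yaml. Only the "optimizers"
# entry is taken from the metadata file; the rest are assumed for the demo.
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

modules = {
    "setup":      {"prerequisites": []},                              # assumed
    "tensor":     {"prerequisites": ["setup"]},                       # assumed
    "autograd":   {"prerequisites": ["tensor"]},                      # assumed
    "optimizers": {"prerequisites": ["setup", "tensor", "autograd"]}, # from module.yaml
}

def build_order(metadata):
    """Return a module order in which every prerequisite comes first."""
    graph = {name: meta["prerequisites"] for name, meta in metadata.items()}
    return list(TopologicalSorter(graph).static_order())

print(build_order(modules))  # "setup" first, "optimizers" after its prerequisites
```

`TopologicalSorter` also raises `CycleError` on circular prerequisites, which gives the CLI a natural validation check for free.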