Files
TinyTorch/modules/source/16_tinygpt/module.yaml
Commit d04d66a716 by Vijay Janapa Reddi: Implement interactive ML Systems questions and standardize module structure
Major Educational Framework Enhancements:
• Deploy interactive NBGrader text response questions across ALL modules
• Replace passive question lists with active 150-300 word student responses
• Enable comprehensive assessment and grading of ML Systems learning

TinyGPT Integration (Module 16):
• Complete TinyGPT implementation showing 70% component reuse from TinyTorch
• Demonstrates vision-to-language framework generalization principles
• Full transformer architecture with attention, tokenization, and generation
• Shakespeare demo showing autoregressive text generation capabilities (see the sketch after this list)
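
A minimal sketch of the autoregressive loop behind such a demo, in plain NumPy. This is illustrative only: generate(), model.forward(), and the tokenizer encode/decode calls shown here are assumptions, not TinyGPT's actual interface.

import numpy as np

# Hypothetical sketch: sample one character at a time, feeding each
# prediction back into the model (the essence of autoregressive decoding).
def generate(model, tokenizer, prompt, max_new_tokens=200, temperature=1.0):
    tokens = tokenizer.encode(prompt)                 # chars -> integer ids (assumed API)
    for _ in range(max_new_tokens):
        logits = model.forward(np.array([tokens]))    # (1, seq_len, vocab) (assumed API)
        last = logits[0, -1] / temperature            # scores for the next token
        probs = np.exp(last - last.max())
        probs /= probs.sum()                          # softmax over the vocabulary
        next_id = int(np.random.choice(len(probs), p=probs))
        tokens.append(next_id)                        # feed the prediction back in
    return tokenizer.decode(tokens)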

Module Structure Standardization:
• Fix section ordering across all modules: Tests → Questions → Summary
• Ensure Module Summary is always the final section for consistency
• Standardize comprehensive testing patterns before educational content

Interactive Question Implementation:
• 3 focused questions per module replacing 10-15 passive questions
• NBGrader integration with manual grading workflow for text responses (cell sketch after this list)
• Questions target ML Systems thinking: scaling, deployment, optimization
• Cumulative knowledge building across the 16-module progression
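
As referenced above, a sketch of what one of these manually graded cells might look like in a module's jupytext dev file. The grade_id, points value, and question text are hypothetical; only the grade/solution metadata pattern follows nbgrader's documented manually-graded-answer cell type.

# %% [markdown] nbgrader={"grade": true, "solution": true, "locked": false, "grade_id": "q1-scaling", "points": 10, "schema_version": 3}
# ### Question 1: Scaling (150-300 words)
# How does attention cost grow with sequence length, and what does that
# imply for deploying TinyGPT on memory-constrained hardware?
#
# YOUR ANSWER HERE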

Technical Infrastructure:
• TPM agent for coordinated multi-agent development workflows
• Enhanced documentation with pedagogical design principles
• Updated book structure to include TinyGPT as capstone demonstration
• Comprehensive QA validation of all module structures

Framework Design Insights:
• Mathematical unity: Dense layers power both vision and language models (sketched after this list)
• Attention as key innovation for sequential relationship modeling
• Production-ready patterns: training loops, optimization, evaluation
• System-level thinking: memory, performance, scaling considerations
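
The mathematical-unity point can be made concrete in a few lines: the same affine map x @ W + b that drives an MLP image classifier supplies the Q/K/V projections inside attention. A minimal NumPy sketch, with a stand-in Dense class that is an assumption rather than TinyTorch's actual implementation:

import numpy as np

class Dense:
    # Stand-in for a generic dense/linear layer (assumed, simplified)
    def __init__(self, in_dim, out_dim):
        self.W = np.random.randn(in_dim, out_dim) * 0.02
        self.b = np.zeros(out_dim)
    def __call__(self, x):
        return x @ self.W + self.b

def self_attention(x, d):
    q_proj, k_proj, v_proj = Dense(d, d), Dense(d, d), Dense(d, d)
    Q, K, V = q_proj(x), k_proj(x), v_proj(x)            # three Dense calls
    scores = Q @ K.T / np.sqrt(d)                        # pairwise relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V                                   # mix the values

x = np.random.randn(8, 16)              # (seq_len, d_model)
print(self_attention(x, 16).shape)      # -> (8, 16)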

Educational Impact:
• Transform passive learning to active engagement through written responses
• Enable instructors to assess deep ML Systems understanding
• Provide clear progression from foundations to complete language models
• Demonstrate real-world framework design principles and trade-offs
Committed: 2025-09-17 14:42:24 -04:00


# TinyTorch Module Metadata
# Essential system information for CLI tools and build systems

name: "tinygpt"
title: "TinyGPT - Language Models"
description: "Build GPT-style transformer models for language understanding using TinyTorch"

# Dependencies - Used by CLI for module ordering and prerequisites
dependencies:
  prerequisites: [
    "setup", "tensor", "activations", "layers", "dense", "spatial", "attention",
    "autograd", "optimizers", "training"
  ]
  enables: []

# Package Export - What gets built into tinytorch package
exports_to: "tinytorch.tinygpt"

# File Structure - What files exist in this module
files:
  dev_file: "tinygpt_dev.py"
  readme: "README.md"
  tests: "inline"

# Educational Metadata
difficulty: "⭐⭐⭐⭐⭐"
time_estimate: "4-6 hours"

# Components - What's implemented in this module
components:
  - "CharTokenizer"
  - "MultiHeadAttention"
  - "TransformerBlock"
  - "TinyGPT"
  - "LanguageModelTrainer"
  - "TextGeneration"