
TinyTorch Module Analyzer & Report Card Generator

A comprehensive, reusable tool for analyzing educational quality and generating actionable report cards for TinyTorch modules.

🎯 Purpose

This tool automatically analyzes TinyTorch modules to:

  • Identify student overwhelm points (complexity cliffs, long cells, missing guidance)
  • Grade educational scaffolding (A-F grades for different aspects)
  • Generate actionable recommendations for improvement
  • Compare modules to find best and worst practices
  • Track progress over time with quantitative metrics

🚀 Quick Start

# Analyze a single module
python tinytorch_module_analyzer.py --module 02_activations

# Analyze all modules
python tinytorch_module_analyzer.py --all

# Compare specific modules
python tinytorch_module_analyzer.py --compare 01_tensor 02_activations 03_layers

# Generate and save detailed reports
python tinytorch_module_analyzer.py --module 02_activations --save

📊 What It Analyzes

Educational Quality Metrics

  • Scaffolding Quality (1-5): How well the module supports student learning
  • Complexity Distribution: Percentage of high-complexity cells
  • Learning Progression: Whether difficulty increases smoothly
  • Implementation Support: Ratio of hints to TODO items
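
As an illustration, the hint ratio could be computed roughly as below. This is a minimal sketch, not the analyzer's actual code: the cell representation and the `TODO`/`HINT` markers are assumptions based on the conventions described in this guide.

```python
# Hypothetical sketch: compute the hint ratio for a list of cells.
# Each cell is a dict with the raw cell source; names are illustrative.

def hint_ratio(cells):
    """Fraction of implementation (TODO) cells that also contain a hint."""
    todo_cells = [c for c in cells if "TODO" in c["source"]]
    if not todo_cells:
        return 1.0  # nothing to implement, so no guidance is missing
    with_hints = [c for c in todo_cells if "HINT" in c["source"].upper()]
    return len(with_hints) / len(todo_cells)

cells = [
    {"source": "# TODO: implement relu\n# HINT: use np.maximum"},
    {"source": "# TODO: implement sigmoid"},
    {"source": "print('setup cell, nothing to implement')"},
]
print(hint_ratio(cells))  # 1 of 2 TODO cells has a hint -> 0.5
```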

Content Metrics

  • Module Length: Total lines and cells
  • Cell Length: Average lines per cell
  • Concept Density: Concepts introduced per cell
  • Test Coverage: Number of test files

Student Experience Factors

  • Overwhelm Points: Specific issues that could frustrate students
  • Complexity Cliffs: Sudden difficulty jumps
  • Missing Guidance: Implementation cells without hints
  • Long Cells: Cells that exceed cognitive load limits
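
A complexity cliff can be detected by scanning consecutive per-cell complexity scores for large jumps. The sketch below is illustrative; the scoring scale and jump threshold are assumptions, not the analyzer's internals.

```python
# Hypothetical sketch: flag "complexity cliffs" -- sudden jumps in
# cell-by-cell complexity scores (1 = easy, 5 = hard; scale is illustrative).

def find_complexity_cliffs(scores, max_jump=2):
    """Return indices of cells whose complexity jumps by more than max_jump."""
    return [i for i in range(1, len(scores))
            if scores[i] - scores[i - 1] > max_jump]

scores = [1, 1, 2, 5, 3, 4]
print(find_complexity_cliffs(scores))  # cell 3 jumps from 2 to 5 -> [3]
```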

📈 Report Card Grades

Overall Grade (A-F)

  • A: Excellent scaffolding, smooth progression, student-friendly
  • B: Good structure with minor issues
  • C: Adequate but needs improvement
  • D: Significant scaffolding problems
  • F: Major issues, likely to overwhelm students

Category Grades

  • Scaffolding: Quality of learning support and guidance
  • Complexity: Appropriateness of difficulty progression
  • Cell_Length: Whether cells are digestible chunks
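
Conceptually, each category grade is a score bucketed into a letter. The cutoffs below are invented for illustration only; the analyzer's actual thresholds may differ.

```python
# Hypothetical sketch: map a 0-100 quality score to a letter grade.
# Cutoffs are illustrative, not the analyzer's real thresholds.

def letter_grade(score):
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return grade
    return "F"

print(letter_grade(75))  # C
```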

🎯 Target Metrics

The analyzer compares modules against these educational best practices:

| Metric                | Target        | Why It Matters                 |
|-----------------------|---------------|--------------------------------|
| Module Length         | 200-400 lines | Manageable scope for students  |
| Cell Length           | ≤30 lines     | Fits cognitive working memory  |
| High-Complexity Cells | ≤30%          | Prevents overwhelm             |
| Scaffolding Quality   | ≥4/5          | Ensures student support        |
| Hint Ratio            | ≥80%          | Implementation guidance        |

🔍 Sample Output

🔍 Analyzing module: 02_activations

📊 Report Card for 02_activations:
Overall Grade: C
Scaffolding Quality: 3/5
Critical Issues: 2

Critical Issues Detected

  • Too many high-complexity cells (77%)
  • Implementation cells lack guidance (40% without hints)
  • Sudden complexity jumps will overwhelm students
  • 3 cells are too long (>50 lines)

Recommendations

  • Add implementation ladders: break complex functions into 3 progressive steps
  • Add concept bridges: connect new ideas to familiar concepts
  • Split 3 long cells into smaller, focused cells
  • Add hints to 4 implementation cells
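
For instance, an "implementation ladder" for a ReLU activation might replace one monolithic TODO with three progressive steps. This is an illustrative teaching example, not code from the analyzer or from TinyTorch itself:

```python
# Step 1: handle a single scalar first (confidence builder).
def relu_scalar(x):
    return x if x > 0 else 0.0

# Step 2: extend to a flat list by reusing the scalar version.
def relu_list(xs):
    return [relu_scalar(x) for x in xs]

# Step 3: generalize to nested lists of any depth (recursive).
def relu(x):
    if isinstance(x, list):
        return [relu(v) for v in x]
    return relu_scalar(x)

print(relu([-1.0, 0.0, 2.0]))  # [0.0, 0.0, 2.0]
```

Each step is a small, testable win, which is exactly the "confidence builder" pattern described under Progressive Scaffolding below.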

📁 Output Formats

JSON Format (for programmatic use)

{
  "module_name": "02_activations",
  "overall_grade": "C",
  "scaffolding_quality": 3,
  "critical_issues": [...],
  "recommendations": [...],
  "cell_analyses": [...]
}
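
Because reports are plain JSON, they are easy to consume from scripts. The sketch below parses a report matching the shape above and flags it if quality falls short; the quality threshold is an assumption taken from the target metrics table, not a built-in API.

```python
import json

# Hypothetical sketch: read a saved report and flag a weak module.
# Field names mirror the JSON shape shown above.
report_json = '''
{"module_name": "02_activations",
 "overall_grade": "C",
 "scaffolding_quality": 3,
 "critical_issues": ["Too many high-complexity cells (77%)"],
 "recommendations": ["Add hints to 4 implementation cells"]}
'''

report = json.loads(report_json)
if report["overall_grade"] in ("D", "F") or report["scaffolding_quality"] < 4:
    print(f"{report['module_name']} needs attention: {report['critical_issues']}")
```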

HTML Format (for human reading)

Beautiful, interactive report cards with:

  • Color-coded grades and metrics
  • Cell-by-cell analysis with complexity indicators
  • Visual progress indicators
  • Actionable recommendations

🔄 Workflow Integration

Before Making Changes

# Get baseline metrics
python tinytorch_module_analyzer.py --module 02_activations --save

After Improvements

# Check improvement
python tinytorch_module_analyzer.py --module 02_activations --save
# Compare with previous reports to track progress

Continuous Monitoring

# Check all modules regularly
python tinytorch_module_analyzer.py --all --save

🎓 Educational Framework

The analyzer is based on proven educational principles:

Rule of 3s

  • Max 3 complexity levels per module
  • Max 3 new concepts per cell
  • Max 30 lines per implementation cell

Progressive Scaffolding

  • Concept bridges: Connect unfamiliar to familiar
  • Implementation ladders: Break complex tasks into steps
  • Confidence builders: Early wins build momentum

Cognitive Load Theory

  • Chunking: Information in digestible pieces
  • Progressive disclosure: Introduce complexity gradually
  • Support structures: Hints and guidance when needed

🛠️ Customization

Modify Target Metrics

Edit the target_metrics in the TinyTorchModuleAnalyzer class:

self.target_metrics = {
    'ideal_lines': (200, 400),
    'max_cell_lines': 30,
    'max_complexity_ratio': 0.3,
    'min_scaffolding_quality': 4,
    'max_concepts_per_cell': 3,
    'min_hint_ratio': 0.8
}
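
As a standalone illustration of how customized targets would be applied, the check below compares one module's measurements against a tightened `max_cell_lines`. The `check_module` function is hypothetical; only the dictionary keys come from the class above.

```python
# Illustrative: check one module's measurements against customized targets.
# Keys mirror target_metrics above; the check function itself is hypothetical.
target_metrics = {
    'ideal_lines': (200, 400),
    'max_cell_lines': 25,   # tightened from the default of 30
    'min_hint_ratio': 0.8,
}

def check_module(total_lines, longest_cell, hint_ratio, targets):
    issues = []
    lo, hi = targets['ideal_lines']
    if not lo <= total_lines <= hi:
        issues.append(f"module length {total_lines} outside {lo}-{hi}")
    if longest_cell > targets['max_cell_lines']:
        issues.append(f"longest cell {longest_cell} > {targets['max_cell_lines']} lines")
    if hint_ratio < targets['min_hint_ratio']:
        issues.append(f"hint ratio {hint_ratio:.0%} below {targets['min_hint_ratio']:.0%}")
    return issues

print(check_module(520, 42, 0.4, target_metrics))  # reports three issues
```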

Add Custom Analysis

Extend the analyzer with domain-specific metrics for your educational context.

📊 Use Cases

For Instructors

  • Quality assurance: Ensure modules meet educational standards
  • Continuous improvement: Track scaffolding quality over time
  • Comparison: Find best practices across modules
  • Student feedback: Predict where students might struggle

For Course Developers

  • Design validation: Check if new modules follow best practices
  • Refactoring guidance: Identify specific improvement areas
  • Progress tracking: Measure improvement after changes
  • Standardization: Ensure consistent quality across modules

For Researchers

  • Educational analytics: Study what makes effective ML education
  • A/B testing: Compare different scaffolding approaches
  • Longitudinal studies: Track student outcomes vs. module quality
  • Best practice identification: Find patterns in successful modules

🎯 Success Stories

After applying analyzer recommendations:

  • 01_tensor: Improved from C to B grade with better scaffolding
  • 02_activations: Reduced overwhelm points from 8 to 2
  • 03_layers: Increased hint ratio from 40% to 85%

The analyzer transforms gut feelings about educational quality into actionable, data-driven improvements.


Ready to improve your educational content? Start with:

python tinytorch_module_analyzer.py --all