
TinyTorch Research Paper

Complete LaTeX source for the TinyTorch research paper.




Quick Start: Get PDF

Option 1: Overleaf (Web)

  1. Go to Overleaf.com
  2. Create a free account
  3. Upload paper.tex and references.bib
  4. Click "Recompile"
  5. Download the PDF

Option 2: Local Compilation

./compile_paper.sh

Requires LaTeX installation (MacTeX or BasicTeX).
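If compile_paper.sh is unavailable, the standard multi-pass pdflatex/bibtex sequence is a reasonable manual fallback. This is a sketch, not the script's guaranteed behavior: the filenames paper.tex and references.bib come from the Overleaf instructions above, and compile_paper.sh may do something different.

```shell
# Manual fallback, assuming a TeX distribution (e.g. MacTeX) is on PATH.
# Standard multi-pass sequence to resolve citations and cross-references.
set -e
pdflatex -interaction=nonstopmode paper.tex   # first pass: writes paper.aux
bibtex paper                                  # resolves citations against references.bib
pdflatex -interaction=nonstopmode paper.tex   # second pass: inserts the bibliography
pdflatex -interaction=nonstopmode paper.tex   # third pass: settles cross-references
```

Running pdflatex twice after bibtex is deliberate: the second pass pulls in the bibliography, and the third stabilizes any page and citation numbers that shifted.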


Paper Details

  • Format: Two-column LaTeX (conference-standard)
  • Length: ~12-15 pages
  • Sections: 7 complete sections
  • Tables: 3 (framework comparison, learning objectives, performance benchmarks)
  • Code listings: 5 (syntax-highlighted Python examples)
  • References: 22 citations

Key Contributions

  1. Progressive disclosure via monkey-patching - Novel pedagogical pattern
  2. Systems-first curriculum design - Memory/FLOPs from Module 01
  3. Historical milestone validation - 70 years of ML as learning modules
  4. Constructionist framework building - Students build complete ML system
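The "progressive disclosure via monkey-patching" pattern in contribution #1 can be sketched in a few lines of Python. This is a hypothetical illustration, not TinyTorch's actual API: the `nn` namespace, `relu` function, and stub message below are invented for the example.

```python
# Sketch of progressive disclosure via monkey-patching (hypothetical names,
# not TinyTorch's real interface): the package ships locked stubs, and each
# completed student module replaces ("monkey-patches") its stub at runtime.
import types

# Stand-in for a shared package namespace (e.g. a tinytorch.nn-style module).
nn = types.SimpleNamespace()

def _relu_stub(x):
    # Locked by default: later features are visible but not yet usable.
    raise NotImplementedError("Complete the activation module to unlock ReLU")

nn.relu = _relu_stub

# A student finishes their own implementation...
def student_relu(x):
    return [max(0.0, v) for v in x]

# ...and the course harness patches it into the shared namespace, so all
# earlier code paths now transparently use the student's version.
nn.relu = student_relu

print(nn.relu([-1.0, 2.0]))  # → [0.0, 2.0]
```

The point of the pattern is that students only ever see the piece they are currently building, while the rest of the framework keeps working around it.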

Framed as a design contribution, with empirical validation planned for Fall 2025.


Submission Venues

  • arXiv - Immediate (establishes priority)
  • SIGCSE 2026 - August deadline (may need a 6-page condensed version)
  • ICER 2026 - After classroom data is collected (full empirical study)

Ready for submission! Upload to Overleaf to get your PDF.