Mirror of https://github.com/MLSysBook/TinyTorch.git (synced 2026-03-11 22:43:34 -05:00)
TinyTorch Research Paper
Complete LaTeX source for the TinyTorch research paper.
Files
- paper.tex - Main paper (~12-15 pages, two-column format)
- references.bib - Bibliography (22 references)
- compile_paper.sh - Build script (requires LaTeX installation)
Quick Start: Get PDF
Option 1: Overleaf (Recommended)
- Go to Overleaf.com
- Create a free account
- Upload paper.tex and references.bib
- Click "Recompile"
- Download the PDF
Option 2: Local Compilation
./compile_paper.sh
Requires a local LaTeX installation (MacTeX or BasicTeX).
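If you prefer to compile by hand, the standard pdflatex/bibtex multi-pass sequence below should produce the same PDF. This is a sketch of the usual workflow, not the actual contents of compile_paper.sh, which are an assumption here.

```shell
# Manual compile (assumed equivalent of compile_paper.sh; requires a TeX distribution on PATH)
pdflatex paper.tex   # first pass: writes paper.aux with citation keys
bibtex paper         # builds the bibliography from references.bib
pdflatex paper.tex   # second pass: pulls in the formatted citations
pdflatex paper.tex   # third pass: resolves remaining cross-references
```

The repeated pdflatex runs are required because citations and cross-references are resolved from the .aux file produced by the previous pass.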
Paper Details
- Format: Two-column LaTeX (conference-standard)
- Length: ~12-15 pages
- Sections: 7 complete sections
- Tables: 3 (framework comparison, learning objectives, performance benchmarks)
- Code listings: 5 (syntax-highlighted Python examples)
- References: 22 citations
Key Contributions
- Progressive disclosure via monkey-patching - Novel pedagogical pattern
- Systems-first curriculum design - Memory/FLOPs from Module 01
- Historical milestone validation - 70 years of ML as learning modules
- Constructionist framework building - Students build complete ML system
The work is framed as a design contribution, with empirical validation planned for Fall 2025.
Submission Venues
- ArXiv - Immediate (establish priority)
- SIGCSE 2026 - August deadline (may require a 6-page condensed version)
- ICER 2026 - After classroom data (full empirical study)
Ready for submission! Upload to Overleaf to get your PDF.