TinyTorch Research Paper
Complete LaTeX source for the TinyTorch research paper.
Files
- paper.tex - Main paper (~12-15 pages, two-column format)
- references.bib - Bibliography (22 references)
- compile_paper.sh - Build script (requires LaTeX installation)
Quick Start: Get PDF
Option 1: Overleaf (Recommended)
- Go to Overleaf.com
- Create free account
- Upload paper.tex and references.bib
- Click "Recompile"
- Download PDF
Option 2: Local Compilation
./compile_paper.sh
Requires LaTeX installation (MacTeX or BasicTeX).
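If you prefer to compile by hand (or compile_paper.sh fails), the standard pdflatex/bibtex cycle should produce the same PDF. This is a sketch assuming a default TeX Live/MacTeX toolchain; the actual steps in compile_paper.sh may differ.

```shell
# Manual build for a paper with a BibTeX bibliography.
# pdflatex runs three times total so citations and cross-references resolve.
pdflatex paper.tex   # first pass: records citation keys in paper.aux
bibtex paper         # reads paper.aux and references.bib, writes paper.bbl
pdflatex paper.tex   # second pass: pulls the bibliography into the document
pdflatex paper.tex   # third pass: fixes remaining cross-references
```

The output is paper.pdf in the same directory; warnings about undefined references on the first pass are expected and disappear by the final run.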
Paper Details
- Format: Two-column LaTeX (conference-standard)
- Length: ~12-15 pages
- Sections: 7 complete sections
- Tables: 3 (framework comparison, learning objectives, performance benchmarks)
- Code listings: 5 (syntax-highlighted Python examples)
- References: 22 citations
Key Contributions
- Progressive disclosure via monkey-patching - Novel pedagogical pattern
- Systems-first curriculum design - Memory/FLOPs from Module 01
- Historical milestone validation - 70 years of ML as learning modules
- Constructionist framework building - Students build complete ML system
Framed as a design contribution, with empirical validation planned for Fall 2025.
Submission Venues
- ArXiv - Immediate (establish priority)
- SIGCSE 2026 - August deadline (may need 6-page condensed version)
- ICER 2026 - After classroom data (full empirical study)
Ready for submission! Upload to Overleaf to get your PDF.