From 3f1021d44862ae909fb6b3cd4bebb9e698e2310e Mon Sep 17 00:00:00 2001
From: Vijay Janapa Reddi
Date: Sun, 30 Nov 2025 16:24:29 -0500
Subject: [PATCH] Update paper to reference both tinytorch.ai and mlsysbook.ai
 URLs

Primary URL is tinytorch.ai with mlsysbook.ai/tinytorch as alternate
---
 paper/paper.tex | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/paper/paper.tex b/paper/paper.tex
index e0917188..0ac0193d 100644
--- a/paper/paper.tex
+++ b/paper/paper.tex
@@ -1115,7 +1115,7 @@ Three pedagogical contributions enable this transformation. \textbf{Progressive
 \textbf{For educators and bootcamp instructors}: TinyTorch supports flexible integration: self-paced learning requiring zero infrastructure (students run locally on laptops), institutional courses with automated NBGrader assessment, or industry team onboarding for ML engineers transitioning from application development to systems work. The modular structure enables selective adoption: foundation tier only (Modules 01--07, teaching core concepts), architecture focus (adding CNNs and Transformers through Module 13), or complete systems coverage (all 20 modules including optimization and deployment). No GPU access required, no cloud credits needed, no infrastructure barriers.
 
-The complete codebase, curriculum materials, and assessment infrastructure are openly available at \texttt{tinytorch.ai} under permissive open-source licensing. We invite the global ML education community to adopt TinyTorch in courses, contribute curriculum improvements, translate materials for international accessibility, fork for domain-specific variants (quantum ML, robotics, edge AI), and empirically evaluate whether implementation-based pedagogy achieves its promise. The difference between engineers who know \emph{what} ML systems do and engineers who understand \emph{why} they work begins with understanding what's inside \texttt{loss.backward()}, and TinyTorch makes that understanding accessible to everyone.
+The complete codebase, curriculum materials, and assessment infrastructure are openly available at \texttt{tinytorch.ai} (or via \texttt{mlsysbook.ai/tinytorch}) under permissive open-source licensing. We invite the global ML education community to adopt TinyTorch in courses, contribute curriculum improvements, translate materials for international accessibility, fork for domain-specific variants (quantum ML, robotics, edge AI), and empirically evaluate whether implementation-based pedagogy achieves its promise. The difference between engineers who know \emph{what} ML systems do and engineers who understand \emph{why} they work begins with understanding what's inside \texttt{loss.backward()}, and TinyTorch makes that understanding accessible to everyone.
 
 \section*{Acknowledgments}