mirror of
https://github.com/MLSysBook/TinyTorch.git
synced 2026-03-11 18:53:37 -05:00
Update paper to reference both tinytorch.ai and mlsysbook.ai URLs
Primary URL is tinytorch.ai with mlsysbook.ai/tinytorch as alternate
@@ -1115,7 +1115,7 @@ Three pedagogical contributions enable this transformation. \textbf{Progressive
\textbf{For educators and bootcamp instructors}: TinyTorch supports flexible integration: self-paced learning requiring zero infrastructure (students run locally on laptops), institutional courses with automated NBGrader assessment, or industry team onboarding for ML engineers transitioning from application development to systems work. The modular structure enables selective adoption: foundation tier only (Modules 01--07, teaching core concepts), architecture focus (adding CNNs and Transformers through Module 13), or complete systems coverage (all 20 modules including optimization and deployment). No GPU access required, no cloud credits needed, no infrastructure barriers.
-The complete codebase, curriculum materials, and assessment infrastructure are openly available at \texttt{tinytorch.ai} under permissive open-source licensing. We invite the global ML education community to adopt TinyTorch in courses, contribute curriculum improvements, translate materials for international accessibility, fork for domain-specific variants (quantum ML, robotics, edge AI), and empirically evaluate whether implementation-based pedagogy achieves its promise. The difference between engineers who know \emph{what} ML systems do and engineers who understand \emph{why} they work begins with understanding what's inside \texttt{loss.backward()}, and TinyTorch makes that understanding accessible to everyone.
+The complete codebase, curriculum materials, and assessment infrastructure are openly available at \texttt{tinytorch.ai} (or via \texttt{mlsysbook.ai/tinytorch}) under permissive open-source licensing. We invite the global ML education community to adopt TinyTorch in courses, contribute curriculum improvements, translate materials for international accessibility, fork for domain-specific variants (quantum ML, robotics, edge AI), and empirically evaluate whether implementation-based pedagogy achieves its promise. The difference between engineers who know \emph{what} ML systems do and engineers who understand \emph{why} they work begins with understanding what's inside \texttt{loss.backward()}, and TinyTorch makes that understanding accessible to everyone.
\section*{Acknowledgments}