Standardize emoji usage across all site pages for professional consistency

- Removed emojis from all section headers (## and ###)
- Reduced emojis in body text and callout boxes
- Standardized link references (removed emoji prefixes)
- Maintained professional tone while keeping content accessible
- Updated quickstart-guide, student-workflow, tito-essentials, faq, datasets, community, resources, testing-framework, learning-progress, checkpoint-system, and all chapter files
This commit is contained in:
Vijay Janapa Reddi
2025-11-12 11:42:03 -05:00
parent f84e7f16ac
commit 29619da811
36 changed files with 253 additions and 231 deletions

View File

@@ -36,7 +36,7 @@ When you implement your own tensor operations, write your own autograd, build yo
---
-## 🎯 Core Learning Concepts
+## Core Learning Concepts
<div style="background: #f7fafc; border: 1px solid #e2e8f0; padding: 2rem; border-radius: 0.5rem; margin: 2rem 0;">
@@ -143,7 +143,7 @@ output = model(input) # YOU know exactly how this works
## What You'll Achieve: Tier-by-Tier Mastery
-### 🏗️ After Foundation Tier (Modules 01-07)
+### After Foundation Tier (Modules 01-07)
Build a complete neural network framework from mathematical first principles:
```python
@@ -167,13 +167,13 @@ for batch in dataloader: # Your data management
**🎯 Foundation Achievement**: 95%+ accuracy on MNIST using 100% your own mathematical implementations
-### 🏛️ After Architecture Tier (Modules 08-13)
+### After Architecture Tier (Modules 08-13)
- **Computer Vision Mastery**: CNNs achieving 75%+ accuracy on CIFAR-10 with YOUR convolution implementations
- **Language Understanding**: Transformers generating coherent text using YOUR attention mechanisms
- **Universal Architecture**: Discover why the SAME mathematical principles work for vision AND language
- **AI Breakthrough Recreation**: Implement the architectures that created the modern AI revolution
-### After Optimization Tier (Modules 14-20)
+### After Optimization Tier (Modules 14-20)
- **Production Performance**: Systems optimized for <100ms inference latency using YOUR profiling tools
- **Memory Efficiency**: Models compressed to 25% original size with YOUR quantization implementations
- **Hardware Acceleration**: Kernels achieving 10x speedups through YOUR vectorization techniques
@@ -185,20 +185,20 @@ for batch in dataloader: # Your data management
TinyTorch's three-tier structure follows the actual historical progression of machine learning breakthroughs:
-### 🏗️ Foundation Era (1980s-1990s) → Foundation Tier
+### Foundation Era (1980s-1990s) → Foundation Tier
**The Beginning**: Mathematical foundations that started it all
- **1986 Breakthrough**: Backpropagation enables multi-layer networks
- **Your Implementation**: Build automatic differentiation and gradient-based optimization
- **Historical Milestone**: Train MLPs to 95%+ accuracy on MNIST using YOUR autograd engine
-### 🏛️ Architecture Era (1990s-2010s) → Architecture Tier
+### Architecture Era (1990s-2010s) → Architecture Tier
**The Revolution**: Specialized architectures for vision and language
- **1998 Breakthrough**: CNNs revolutionize computer vision (LeCun's LeNet)
- **2017 Breakthrough**: Transformers unify vision and language ("Attention is All You Need")
- **Your Implementation**: Build CNNs achieving 75%+ on CIFAR-10, then transformers for text generation
- **Historical Milestone**: Recreate both revolutions using YOUR spatial and attention implementations
-### Optimization Era (2010s-Present) → Optimization Tier
+### Optimization Era (2010s-Present) → Optimization Tier
**The Engineering**: Production systems that scale to billions of users
- **2020s Breakthrough**: Efficient inference enables real-time LLMs (GPT, ChatGPT)
- **Your Implementation**: Build KV-caching, quantization, and production optimizations
@@ -278,7 +278,7 @@ After each tier, you become the team member who:
---
-## 🚀 Start Your Journey
+## Start Your Journey
<div style="background: #f8f9fa; padding: 2rem; border-radius: 0.5rem; margin: 2rem 0; text-align: center;">
<h3 style="margin: 0 0 1rem 0; color: #495057;">Begin Building ML Systems</h3>
@@ -288,9 +288,9 @@ After each tier, you become the team member who:
</div>
**Next Steps**:
-- **New to TinyTorch**: Start with [Quick Start Guide](../quickstart-guide.html) for immediate hands-on experience
-- **Ready to Commit**: Begin [Module 01: Setup](01-setup.html) to configure your development environment
-- **Teaching a Course**: Review [Instructor Guide](../usage-paths/classroom-use.html) for classroom integration
+- **New to TinyTorch**: Start with [Quick Start Guide](../quickstart-guide.md) for immediate hands-on experience
+- **Ready to Commit**: Begin [Module 01: Tensor](../../modules/01_tensor/ABOUT.md) to start building
+- **Teaching a Course**: Review [Instructor Guide](../usage-paths/classroom-use.md) for classroom integration
```{admonition} Your Three-Tier Journey Awaits
:class: tip
@@ -303,11 +303,11 @@ By completing all three tiers, you'll have built a complete ML framework that ri
All using code you wrote yourself, from mathematical first principles to production optimization.
```
-**📖 Want to understand the pedagogical narrative behind this structure?** See [The Learning Journey](learning-journey.html) to understand WHY modules flow this way and HOW they build on each other through a six-act learning story.
+**📖 Want to understand the pedagogical narrative behind this structure?** See [The Learning Journey](learning-journey.md) to understand WHY modules flow this way and HOW they build on each other through a six-act learning story.
---
-### 🏗️ FOUNDATION TIER (Modules 01-07)
+### Foundation Tier (Modules 01-07)
**Building Blocks of ML Systems • 6-8 weeks • All Prerequisites for Neural Networks**
<div style="background: #f8f9fd; border: 1px solid #e0e7ff; padding: 2rem; border-radius: 0.5rem; margin: 2rem 0;">
@@ -342,7 +342,7 @@ All using code you wrote yourself, from mathematical first principles to product
---
-### 🏛️ ARCHITECTURE TIER (Modules 08-13)
+### Architecture Tier (Modules 08-13)
**Modern AI Algorithms • 4-6 weeks • Vision + Language Architectures**
<div style="background: #fef7ff; border: 1px solid #f3e8ff; padding: 2rem; border-radius: 0.5rem; margin: 2rem 0;">
@@ -376,7 +376,7 @@ All using code you wrote yourself, from mathematical first principles to product
---
-### ⚡ OPTIMIZATION TIER (Modules 14-20)
+### Optimization Tier (Modules 14-19)
**Production & Performance • 4-6 weeks • Deploy and Scale ML Systems**
<div style="background: #f0fdfa; border: 1px solid #a7f3d0; padding: 2rem; border-radius: 0.5rem; margin: 2rem 0;">
@@ -411,7 +411,7 @@ All using code you wrote yourself, from mathematical first principles to product
---
-## 🎯 Learning Path Recommendations
+## Learning Path Recommendations
### Choose Your Learning Style

View File

@@ -7,9 +7,9 @@
## What This Page Is About
This page tells the **pedagogical story** behind TinyTorch's module progression. While other pages explain:
-- **WHAT you'll build** ([Three-Tier Structure](00-introduction.html)) - organized module breakdown
-- **WHEN in history** ([Milestones](milestones.html)) - recreating ML breakthroughs
-- **WHERE you are** ([Progress Tracking](../learning-progress.html)) - capability checkpoints
+- **WHAT you'll build** ([Three-Tier Structure](00-introduction.md)) - organized module breakdown
+- **WHEN in history** ([Milestones](milestones.md)) - recreating ML breakthroughs
+- **WHERE you are** ([Progress Tracking](../learning-progress.md)) - capability checkpoints
This page explains **WHY modules flow this way** - the learning narrative that transforms 20 individual modules into a coherent journey from mathematical foundations to production AI systems.
@@ -312,7 +312,7 @@ As you progress through TinyTorch, you advance along **two dimensions simultaneo
**Understanding Both Dimensions**: The **Acts** explain WHY you're building each component (pedagogical progression). The **Milestones** prove WHAT you've built actually works (historical validation).
-**📖 See [Journey Through ML History](milestones.html)** for complete milestone details and how to run them.
+**📖 See [Journey Through ML History](milestones.md)** for complete milestone details and how to run them.
---
@@ -346,7 +346,7 @@ The learning journey also maps to **21 capability checkpoints** you can track:
- Checkpoint 19: Competitive benchmarking ✓
- Checkpoint 20: Complete systems ✓
-**📖 See [Progress Tracking](../learning-progress.html)** to monitor your capability development.
+**📖 See [Progress Tracking](../learning-progress.md)** to monitor your capability development.
---
@@ -527,7 +527,7 @@ Typical time estimates (varies by background):
- Act V → Systems (2024)
- Act VI → TinyGPT (complete)
-**📖 See [Milestones](milestones.html)** for details.
+**📖 See [Milestones](milestones.md)** for details.
---
@@ -543,10 +543,10 @@ Typical time estimates (varies by background):
</div>
**Related Resources**:
-- **[Three-Tier Structure](00-introduction.html)** - Organized module breakdown with time estimates
-- **[Journey Through ML History](milestones.html)** - Historical milestones you'll recreate
-- **[Progress Tracking](../learning-progress.html)** - Monitor your capability development
-- **[Quick Start Guide](../quickstart-guide.html)** - Hands-on setup and first module
+- **[Three-Tier Structure](00-introduction.md)** - Organized module breakdown with time estimates
+- **[Journey Through ML History](milestones.md)** - Historical milestones you'll recreate
+- **[Progress Tracking](../learning-progress.md)** - Monitor your capability development
+- **[Quick Start Guide](../quickstart-guide.md)** - Hands-on setup and first module
---

View File

@@ -1,23 +1,23 @@
-# 🏆 Journey Through ML History
+# Journey Through ML History
**Experience the evolution of AI by rebuilding history's most important breakthroughs with YOUR TinyTorch implementations.**
---
-## 🎯 What Are Milestones?
+## What Are Milestones?
Milestones are **proof-of-mastery demonstrations** that showcase what you can build after completing specific modules. Each milestone recreates a historically significant ML achievement using YOUR implementations.
### Why This Approach?
-- 🧠 **Deep Understanding**: Experience the actual challenges researchers faced
-- 📈 **Progressive Learning**: Each milestone builds on previous foundations
-- 🏆 **Real Achievements**: Not toy examples - these are historically significant breakthroughs
-- 🔧 **Systems Thinking**: Understand WHY each innovation mattered for ML systems
+- **Deep Understanding**: Experience the actual challenges researchers faced
+- **Progressive Learning**: Each milestone builds on previous foundations
+- **Real Achievements**: Not toy examples - these are historically significant breakthroughs
+- **Systems Thinking**: Understand WHY each innovation mattered for ML systems
---
-## 🎯 Two Dimensions of Your Progress
+## Two Dimensions of Your Progress
As you build TinyTorch, you're progressing along **TWO dimensions simultaneously**:
@@ -30,7 +30,7 @@ As you build TinyTorch, you're progressing along **TWO dimensions simultaneously
**Act V (14-19)**: Production systems - optimization and deployment
**Act VI (20)**: Complete integration - unified AI systems
-**📖 See [The Learning Journey](learning-journey.html)** for the complete pedagogical narrative explaining WHY modules flow this way.
+See [The Learning Journey](learning-journey.md) for the complete pedagogical narrative explaining WHY modules flow this way.
### Historical Dimension (Milestones): What You CAN Build
@@ -45,20 +45,20 @@ As you build TinyTorch, you're progressing along **TWO dimensions simultaneously
| Learning Act | Unlocked Milestone | Proof of Mastery |
|--------------|-------------------|------------------|
-| **Act I: Foundation (01-04)** | 🧠 1957 Perceptron | Your Linear layer recreates history |
-| **Act II: Learning (05-07)** | 1969 XOR + 🔢 1986 MLP | Your autograd enables training (95%+ MNIST) |
-| **Act III: Data & Scale (08-09)** | 🖼️ 1998 CNN | Your Conv2d achieves 75%+ on CIFAR-10 |
-| **Act IV: Language (10-13)** | 🤖 2017 Transformers | Your attention generates coherent text |
-| **Act V: Production (14-18)** | 2018 MLPerf | Your optimizations achieve production speed |
-| **Act VI: Integration (19-20)** | 🏆 Benchmarking + Capstone | Your complete framework competes |
+| **Act I: Foundation (01-04)** | 1957 Perceptron | Your Linear layer recreates history |
+| **Act II: Learning (05-07)** | 1969 XOR + 1986 MLP | Your autograd enables training (95%+ MNIST) |
+| **Act III: Data & Scale (08-09)** | 1998 CNN | Your Conv2d achieves 75%+ on CIFAR-10 |
+| **Act IV: Language (10-13)** | 2017 Transformers | Your attention generates coherent text |
+| **Act V: Production (14-18)** | 2018 MLPerf | Your optimizations achieve production speed |
+| **Act VI: Integration (19-20)** | Benchmarking + Capstone | Your complete framework competes |
**Understanding Both Dimensions**: The **Acts** explain WHY you're building each component (pedagogical progression). The **Milestones** prove WHAT you've built works (historical validation). Together, they show you're not just completing exercises - you're building something real.
---
-## 📅 The Timeline
+## The Timeline
-### 🧠 01. Perceptron (1957) - Rosenblatt
+### 01. Perceptron (1957) - Rosenblatt
**After Modules 02-04**
@@ -88,7 +88,7 @@ python 02_rosenblatt_trained.py # See the solution (trained)
---
-### 02. XOR Crisis (1969) - Minsky & Papert
+### 02. XOR Crisis (1969) - Minsky & Papert
**After Modules 02-06**
@@ -118,7 +118,7 @@ python 02_xor_solved.py # Hidden layers solve it!
---
-### 🔢 03. MLP Revival (1986) - Backpropagation Era
+### 03. MLP Revival (1986) - Backpropagation Era
**After Modules 02-08**
@@ -148,7 +148,7 @@ python 02_rumelhart_mnist.py # Full MNIST
---
-### 🖼️ 04. CNN Revolution (1998) - LeCun's Breakthrough
+### 04. CNN Revolution (1998) - LeCun's Breakthrough
**After Modules 02-09**
**🎯 North Star Achievement**
@@ -178,7 +178,7 @@ python 02_lecun_cifar10.py # CIFAR-10 @ 75%+ accuracy
---
-### 🤖 05. Transformer Era (2017) - Attention Revolution
+### 05. Transformer Era (2017) - Attention Revolution
**After Modules 02-13**
@@ -208,7 +208,7 @@ python 02_vaswani_dialogue.py # Multi-turn dialogue
---
-### 06. MLPerf Era (2018) - The Optimization Revolution
+### 06. MLPerf Era (2018) - The Optimization Revolution
**After Modules 14-18**
@@ -239,7 +239,7 @@ python 03_generation_opts.py # Speed up inference (cache + batch)
---
-## 🎓 Learning Philosophy
+## Learning Philosophy
### Progressive Capability Building
@@ -263,7 +263,7 @@ Each milestone teaches critical systems thinking:
---
-## 🚀 How to Use Milestones
+## How to Use Milestones
### 1. Complete Prerequisites
@@ -302,7 +302,7 @@ Each milestone includes:
---
-## 🎯 Quick Reference
+## Quick Reference
### Milestone Prerequisites
@@ -324,7 +324,7 @@ Each milestone includes:
---
-## 📚 Further Learning
+## Further Learning
After completing milestones, explore:
@@ -335,7 +335,7 @@ After completing milestones, explore:
---
-## 🌟 Why This Matters
+## Why This Matters
**Most courses teach you to USE frameworks.**
**TinyTorch teaches you to UNDERSTAND them.**

View File

@@ -1,4 +1,4 @@
-# 🎯 TinyTorch Checkpoint System
+# TinyTorch Checkpoint System
<div style="background: #fff3cd; border: 1px solid #ffc107; padding: 1.5rem; border-radius: 0.5rem; margin: 2rem 0;">
<h3 style="margin: 0 0 0.5rem 0; color: #856404;">📋 Optional Progress Tracking</h3>
@@ -42,9 +42,9 @@ The TinyTorch checkpoint system provides optional infrastructure for capability
---
-## 🚀 The Five Major Checkpoints
+## The Five Major Checkpoints
-### 🎯 Foundation
+### Foundation
*Core ML primitives and environment setup*
**Modules**: Setup • Tensors • Activations
@@ -152,7 +152,7 @@ Every checkpoint completion unlocks a concrete capability:
The checkpoint system provides comprehensive progress tracking and capability validation through automated testing infrastructure.
-**📖 See [Essential Commands](tito-essentials.html)** for complete command reference and usage examples.
+**📖 See [Essential Commands](tito-essentials.md)** for complete command reference and usage examples.
### Integration with Development
The checkpoint system connects directly to your actual development work:
@@ -248,7 +248,7 @@ The checkpoint progression **Foundation → Architecture → Training → Infere
- **Problem**: Modules don't work together due to missing dependencies
- **Solution**: Verify prerequisite capabilities before testing advanced features
-**📖 See [Essential Commands](tito-essentials.html)** for complete debugging command reference.
+**📖 See [Essential Commands](tito-essentials.md)** for complete debugging command reference.
### Checkpoint Test Structure
@@ -299,4 +299,4 @@ print("🏆 Foundation checkpoint PASSED")
- Analyze memory usage during testing
- Identify bottlenecks in capability validation
-**📖 See [Essential Commands](tito-essentials.html)** for complete command reference and advanced usage examples.
+**📖 See [Essential Commands](tito-essentials.md)** for complete command reference and advanced usage examples.
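Conceptually, a checkpoint test is a named list of small capability checks that either all pass or report the first failure. A minimal, framework-free sketch of that pattern (the `run_checkpoint` helper and the stand-in checks below are illustrative, not the actual `tito` implementation):

```python
def run_checkpoint(name, checks):
    """Run (label, predicate) pairs; report the first failure, if any."""
    for label, predicate in checks:
        if not predicate():
            print(f"FAILED {name} checkpoint: {label}")
            return False
    print(f"PASSED {name} checkpoint")
    return True

# Invented stand-in checks; real checkpoints exercise YOUR Tensor,
# activation, and layer implementations.
foundation_checks = [
    ("tensors hold data", lambda: [1.0, 2.0, 3.0][0] == 1.0),
    ("shapes are consistent", lambda: len([[1, 2], [3, 4]]) == 2),
]
run_checkpoint("Foundation", foundation_checks)
```

Real checkpoint tests would replace the lambdas with assertions against your exported `tinytorch` package.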

View File

@@ -1,42 +1,42 @@
-# 🌍 Community Ecosystem
+# Community Ecosystem
**Building Together**
---
-## 🎯 Overview
+## Overview
TinyTorch is more than just a course—it's a growing community of students, educators, and ML engineers learning systems engineering from first principles.
---
-## 📊 Community Platform (Coming Soon)
+## Community Platform (Coming Soon)
<div style="background: #e3f2fd; border: 2px solid #2196f3; padding: 1.5rem; border-radius: 0.5rem; margin: 2rem 0;">
-<h3 style="margin: 0 0 1rem 0; color: #1565c0;">🚧 Building Community Features</h3>
+<h3 style="margin: 0 0 1rem 0; color: #1565c0;">Building Community Features</h3>
<p style="margin: 0; color: #1565c0;">We're creating live community features including activity dashboards, study partner matching, and real-time progress tracking. Stay tuned!</p>
</div>
### Planned Features
-**📊 Live Dashboard**
+**Live Dashboard**
- Real-time community activity
- Global learning progress
- Module completion stats
-**🤝 Connection Hub**
+**Connection Hub**
- Find study partners
- Join study groups
- Connect with peers
-**🌍 Global Reach**
+**Global Reach**
- See who's learning worldwide
- Geographic distribution
- Community milestones
---
-## 🚀 Get Involved Now
+## Get Involved Now
**Learn Together**
- Ask questions in [GitHub Discussions](https://github.com/harvard-edge/TinyTorch/discussions)
@@ -55,4 +55,4 @@ TinyTorch is more than just a course—it's a growing community of students, edu
---
-**Build ML systems. Learn together. Grow the community.** 🌍
+**Build ML systems. Learn together. Grow the community.**

View File

@@ -14,7 +14,7 @@ TinyTorch uses a two-tier dataset approach:
<div style="display: grid; grid-template-columns: 1fr 1fr; gap: 1.5rem; margin: 2rem 0;">
<div style="background: #e3f2fd; border: 1px solid #2196f3; padding: 1.5rem; border-radius: 0.5rem;">
-<h3 style="margin: 0 0 1rem 0; color: #1976d2;">📦 Shipped Datasets</h3>
+<h3 style="margin: 0 0 1rem 0; color: #1976d2;">Shipped Datasets</h3>
<p style="margin: 0 0 1rem 0;"><strong>~350 KB total - Ships with repository</strong></p>
<ul style="margin: 0; font-size: 0.9rem;">
<li>Small enough to fit in Git (~1K samples each)</li>
@@ -26,7 +26,7 @@ TinyTorch uses a two-tier dataset approach:
</div>
<div style="background: #f3e5f5; border: 1px solid #9c27b0; padding: 1.5rem; border-radius: 0.5rem;">
-<h3 style="margin: 0 0 1rem 0; color: #7b1fa2;">⬇️ Downloaded Datasets</h3>
+<h3 style="margin: 0 0 1rem 0; color: #7b1fa2;">Downloaded Datasets</h3>
<p style="margin: 0 0 1rem 0;"><strong>~180 MB - Auto-downloaded when needed</strong></p>
<ul style="margin: 0; font-size: 0.9rem;">
<li>Standard ML benchmarks (MNIST, CIFAR-10)</li>
@@ -49,9 +49,9 @@ TinyTorch uses a two-tier dataset approach:
<div style="background: #fff5f5; border-left: 4px solid #e74c3c; padding: 1.5rem; margin: 1.5rem 0;">
-**📍 Location**: `datasets/tinydigits/`
-**📊 Size**: ~310 KB
-**🎯 Used by**: Milestones 03 & 04 (MLP and CNN examples)
+**Location**: `datasets/tinydigits/`
+**Size**: ~310 KB
+**Used by**: Milestones 03 & 04 (MLP and CNN examples)
**Contents:**
- 1,000 training samples
@@ -82,9 +82,9 @@ X_train, y_train, X_test, y_test = load_tinydigits()
<div style="background: #f0fff4; border-left: 4px solid #22c55e; padding: 1.5rem; margin: 1.5rem 0;">
-**📍 Location**: `datasets/tinytalks/`
-**📊 Size**: ~40 KB
-**🎯 Used by**: Milestone 05 (Transformer/GPT text generation)
+**Location**: `datasets/tinytalks/`
+**Size**: ~40 KB
+**Used by**: Milestone 05 (Transformer/GPT text generation)
**Contents:**
- 350 Q&A pairs across 5 difficulty levels
@@ -117,7 +117,7 @@ dataset = load_tinytalks()
# Returns list of (question, answer) pairs
```
-**📖 See detailed documentation:** `datasets/tinytalks/README.md`
+See detailed documentation: `datasets/tinytalks/README.md`
</div>
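Because TinyTalks is character-tokenization friendly, a vocabulary can be built directly from the raw (question, answer) strings. A rough sketch under that assumption (the sample pairs and `build_char_vocab` helper are invented for illustration; the real data comes from `load_tinytalks()`):

```python
# Invented sample pairs in the (question, answer) shape described above.
pairs = [
    ("What is two plus two?", "Four."),
    ("What color is the sky?", "Blue."),
]

def build_char_vocab(pairs):
    """Map every character seen in the Q&A pairs to an integer id."""
    chars = sorted({c for q, a in pairs for c in q + a})
    return {c: i for i, c in enumerate(chars)}

vocab = build_char_vocab(pairs)

def encode(text):
    """Turn a string into a list of character ids."""
    return [vocab[c] for c in text]

ids = encode("Four.")  # one id per character, so 5 ids
```

Character-level tokenization keeps the vocabulary tiny, which is why a 40 KB dataset is enough to exercise a transformer.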
@@ -131,9 +131,9 @@ These standard benchmarks download automatically when you run relevant milestone
<div style="background: #fffbeb; border-left: 4px solid #f59e0b; padding: 1.5rem; margin: 1.5rem 0;">
-**📍 Downloads to**: `milestones/datasets/mnist/`
-**📊 Size**: ~10 MB (compressed)
-**🎯 Used by**: `milestones/03_1986_mlp/02_rumelhart_mnist.py`
+**Downloads to**: `milestones/datasets/mnist/`
+**Size**: ~10 MB (compressed)
+**Used by**: `milestones/03_1986_mlp/02_rumelhart_mnist.py`
**Contents:**
- 60,000 training samples
@@ -157,9 +157,9 @@ These standard benchmarks download automatically when you run relevant milestone
<div style="background: #fdf2f8; border-left: 4px solid #ec4899; padding: 1.5rem; margin: 1.5rem 0;">
-**📍 Downloads to**: `milestones/datasets/cifar-10/`
-**📊 Size**: ~170 MB (compressed)
-**🎯 Used by**: `milestones/04_1998_cnn/02_lecun_cifar10.py`
+**Downloads to**: `milestones/datasets/cifar-10/`
+**Size**: ~170 MB (compressed)
+**Used by**: `milestones/04_1998_cnn/02_lecun_cifar10.py`
**Contents:**
- 50,000 training samples
@@ -186,28 +186,28 @@ These standard benchmarks download automatically when you run relevant milestone
### Why These Specific Datasets?
**TinyDigits (not full MNIST):**
-- 100× faster training iterations
-- Ships with repo (no download)
-- Same conceptual challenges
-- Perfect for learning and debugging
+- 100× faster training iterations
+- Ships with repo (no download)
+- Same conceptual challenges
+- Perfect for learning and debugging
**TinyTalks (custom dataset):**
-- Designed for educational progression
-- Scaffolded difficulty levels
-- Character-level tokenization friendly
-- Engaging conversational format
+- Designed for educational progression
+- Scaffolded difficulty levels
+- Character-level tokenization friendly
+- Engaging conversational format
**MNIST (when scaling up):**
-- Industry standard benchmark
-- Validates your implementation
-- Comparable to published results
-- 95%+ accuracy is achievable milestone
+- Industry standard benchmark
+- Validates your implementation
+- Comparable to published results
+- 95%+ accuracy is achievable milestone
**CIFAR-10 (for CNN validation):**
-- Natural images (harder than digits)
-- RGB channels (multi-dimensional)
-- Standard CNN benchmark
-- 75%+ with basic CNN proves it works
+- Natural images (harder than digits)
+- RGB channels (multi-dimensional)
+- Standard CNN benchmark
+- 75%+ with basic CNN proves it works
---
@@ -249,12 +249,12 @@ conversations = load_tinytalks()
| Dataset | Size | Samples | Ships With Repo | Purpose |
|---------|------|---------|-----------------|---------|
-| TinyDigits | 310 KB | 1,200 | Yes | Fast MLP/CNN iteration |
-| TinyTalks | 40 KB | 350 pairs | Yes | Transformer learning |
-| MNIST | 10 MB | 70,000 | Downloads | MLP validation |
-| CIFAR-10 | 170 MB | 60,000 | Downloads | CNN validation |
+| TinyDigits | 310 KB | 1,200 | Yes | Fast MLP/CNN iteration |
+| TinyTalks | 40 KB | 350 pairs | Yes | Transformer learning |
+| MNIST | 10 MB | 70,000 | Downloads | MLP validation |
+| CIFAR-10 | 170 MB | 60,000 | Downloads | CNN validation |
-**Total shipped**: ~350 KB
+**Total shipped**: ~350 KB
**Total with benchmarks**: ~180 MB
---
@@ -283,27 +283,27 @@ conversations = load_tinytalks()
## Frequently Asked Questions
-**Q: Why not use full MNIST from the start?**
+**Q: Why not use full MNIST from the start?**
A: TinyDigits trains 100× faster, enabling rapid iteration during learning. MNIST validates your complete implementation later.
-**Q: Can I use my own datasets?**
+**Q: Can I use my own datasets?**
A: Absolutely! TinyTorch is a real framework—add your data loading code just like PyTorch.
-**Q: Why ship datasets in Git?**
+**Q: Why ship datasets in Git?**
A: 350 KB is negligible (smaller than many images), and it enables offline learning with instant iteration.
-**Q: Where does CIFAR-10 download from?**
+**Q: Where does CIFAR-10 download from?**
A: Official sources via `milestones/data_manager.py`, with integrity verification.
-**Q: Can I skip the large downloads?**
+**Q: Can I skip the large downloads?**
A: Yes! You can work through most milestones using only shipped datasets. Downloaded datasets are for validation milestones.
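Integrity verification here means hashing the downloaded bytes and comparing against a known digest. A minimal sketch of the idea (the payload below is a placeholder; the actual download logic and expected digests live in `milestones/data_manager.py`):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, expected_hex: str) -> bool:
    """True only if the payload hashes to the expected digest."""
    return sha256_hex(data) == expected_hex

# Placeholder payload; a real check hashes the downloaded archive file.
payload = b"tinytorch-dataset-archive"
expected = sha256_hex(payload)

assert verify(payload, expected)             # intact payload passes
assert not verify(payload + b"!", expected)  # any corruption is caught
```

Pinning digests this way catches both corrupted transfers and tampered mirrors.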
---
## Related Documentation
-- **📖 [Milestones Guide](chapters/milestones.html)** - See how each dataset is used in historical achievements
-- **📖 [Student Workflow](student-workflow.html)** - Learn the development cycle
-- **📖 [Quick Start](quickstart-guide.html)** - Start building in 15 minutes
+- [Milestones Guide](chapters/milestones.md) - See how each dataset is used in historical achievements
+- [Student Workflow](student-workflow.md) - Learn the development cycle
+- [Quick Start](quickstart-guide.md) - Start building in 15 minutes
**Dataset implementation details**: See `datasets/tinydigits/README.md` and `datasets/tinytalks/README.md` for technical specifications.

View File

@@ -212,7 +212,7 @@ Milestones are historical ML achievements you recreate with YOUR implementations
Each milestone proves your framework works by running actual ML experiments.
-**📖 See [Journey Through ML History](chapters/milestones.html)** for details.
+**📖 See [Journey Through ML History](chapters/milestones.md)** for details.
### Are the checkpoints required?
@@ -228,7 +228,7 @@ Each milestone proves your framework works by running actual ML experiments.
- Helpful for self-assessment
- Use `tito checkpoint status` to view progress
-**📖 See [Student Workflow](student-workflow.html)** for the core development cycle.
+**📖 See [Student Workflow](student-workflow.md)** for the core development cycle.
---
@@ -255,7 +255,7 @@ cd modules/01_tensor
jupyter lab tensor_dev.py
```
-**📖 See [Quick Start Guide](quickstart-guide.html)** for detailed setup.
+**📖 See [Quick Start Guide](quickstart-guide.md)** for detailed setup.
### What's the typical workflow?
@@ -272,7 +272,7 @@ cd ../../milestones/01_1957_perceptron
python rosenblatt_forward.py # Uses YOUR implementation!
```
-**📖 See [Student Workflow](student-workflow.html)** for complete details.
+**📖 See [Student Workflow](student-workflow.md)** for complete details.
### Can I use this in my classroom?
@@ -283,7 +283,7 @@ python rosenblatt_forward.py # Uses YOUR implementation!
- NBGrader integration coming soon for automated grading
- Instructor tooling under development
-**📖 See [Classroom Use Guide](usage-paths/classroom-use.html)** for details.
+**📖 See [Classroom Use Guide](usage-paths/classroom-use.md)** for details.
### How do I get help?

View File

@@ -342,7 +342,7 @@ graph LR
**The essential three-step cycle**: Edit → Export → Validate
-**📖 See [Student Workflow](student-workflow.html)** for detailed workflow guide.
+**📖 See [Student Workflow](student-workflow.md)** for detailed workflow guide.
---
@@ -382,7 +382,7 @@ flowchart TB
**Strategy**: Start small (shipped datasets), iterate fast, then validate on benchmarks (downloaded datasets).
-**📖 See [Datasets Guide](datasets.html)** for complete dataset documentation.
+**📖 See [Datasets Guide](datasets.md)** for complete dataset documentation.
---
@@ -484,8 +484,8 @@ xychart-beta
## Related Pages
-- **📖 [Introduction](intro.html)** - What is TinyTorch and why build from scratch
-- **📖 [Student Workflow](student-workflow.html)** - The essential edit → export → validate cycle
-- **📖 [Three-Tier Structure](chapters/00-introduction.html)** - Detailed tier breakdown
-- **📖 [Milestones](chapters/milestones.html)** - Journey through ML history
-- **📖 [FAQ](faq.html)** - Common questions answered
+- **📖 [Introduction](intro.md)** - What is TinyTorch and why build from scratch
+- **📖 [Student Workflow](student-workflow.md)** - The essential edit → export → validate cycle
+- **📖 [Three-Tier Structure](chapters/00-introduction.md)** - Detailed tier breakdown
+- **📖 [Milestones](chapters/milestones.md)** - Journey through ML history
+- **📖 [FAQ](faq.md)** - Common questions answered

View File

@@ -11,7 +11,7 @@
TinyTorch follows a simple three-step cycle: **Edit modules → Export to package → Validate with milestones**
-**📖 See [Student Workflow](student-workflow.html)** for the complete development cycle, best practices, and troubleshooting.
+See [Student Workflow](student-workflow.md) for the complete development cycle, best practices, and troubleshooting.
## Understanding Modules vs Checkpoints vs Milestones
@@ -35,7 +35,7 @@ TinyTorch follows a simple three-step cycle: **Edit modules → Export to packag
- Tracks capability mastery
- Not required for the core workflow
-**📖 See [Journey Through ML History](chapters/milestones.html)** for milestone details.
+See [Journey Through ML History](chapters/milestones.md) for milestone details.
</div>
@@ -43,7 +43,7 @@ TinyTorch follows a simple three-step cycle: **Edit modules → Export to packag
TinyTorch organizes 20 modules through three pedagogically-motivated tiers: **Foundation** (build mathematical infrastructure), **Architecture** (implement modern AI), and **Optimization** (deploy production systems).
-**📖 See [Three-Tier Learning Structure](chapters/00-introduction.html#three-tier-learning-pathway-build-complete-ml-systems)** for complete tier breakdown, detailed module descriptions, time estimates, and learning outcomes.
+See [Three-Tier Learning Structure](chapters/00-introduction.md) for complete tier breakdown, detailed module descriptions, time estimates, and learning outcomes.
## Module Progression Checklist
@@ -70,7 +70,7 @@ Track your journey through the 20 modules:
- [ ] **Module 19**: Benchmarking - MLPerf-style comparison
- [ ] **Module 20**: Competition - Capstone challenge
-**📖 See [Quick Start Guide](quickstart-guide.html)** for immediate hands-on experience with your first module.
+**📖 See [Quick Start Guide](quickstart-guide.md)** for immediate hands-on experience with your first module.
## Optional: Checkpoint System
@@ -82,7 +82,7 @@ tito checkpoint status # View your progress
This provides 21 capability checkpoints corresponding to modules and validates your understanding. Helpful for self-assessment but **not required** for the core workflow.
-**📖 See [Essential Commands](tito-essentials.html)** for checkpoint commands.
+**📖 See [Essential Commands](tito-essentials.md)** for checkpoint commands.
---
@@ -149,6 +149,6 @@ python 01_rosenblatt_forward.py # Uses YOUR implementation!
**Optional**: Use `tito checkpoint status` to see capability tracking
-**📖 See [Student Workflow](student-workflow.html)** for the complete development cycle.
+**📖 See [Student Workflow](student-workflow.md)** for the complete development cycle.
**Approach**: You're building ML systems engineering capabilities through hands-on implementation. Each module adds new functionality to your framework, and milestones prove it works.

View File

@@ -0,0 +1 @@
../../modules/01_tensor/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/02_activations/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/03_layers/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/04_losses/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/05_autograd/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/06_optimizers/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/07_training/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/08_dataloader/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/09_spatial/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/10_tokenization/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/11_embeddings/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/12_attention/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/13_transformers/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/14_profiling/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/15_quantization/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/16_compression/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/17_memoization/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/18_acceleration/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/19_benchmarking/ABOUT.md

View File

@@ -0,0 +1 @@
../../modules/20_capstone/ABOUT.md

View File

@@ -7,7 +7,7 @@
**Purpose**: Get hands-on experience building ML systems in 15 minutes. Complete setup verification and build your first neural network component from scratch.
## 2-Minute Setup
## 2-Minute Setup
Let's get you ready to build ML systems:
@@ -27,12 +27,12 @@ source activate.sh
```
**What this does:**
- Creates optimized virtual environment (arm64 on Apple Silicon)
- Installs all dependencies (NumPy, Jupyter, Rich, PyTorch for validation)
- Configures TinyTorch in development mode
- Verifies installation
- Creates optimized virtual environment (arm64 on Apple Silicon)
- Installs all dependencies (NumPy, Jupyter, Rich, PyTorch for validation)
- Configures TinyTorch in development mode
- Verifies installation
**📖 See [Essential Commands](tito-essentials.html)** for detailed workflow and troubleshooting.
See [Essential Commands](tito-essentials.md) for detailed workflow and troubleshooting.
</div>
@@ -44,13 +44,13 @@ source activate.sh
tito system doctor
```
You should see all green checkmarks! This confirms your environment is ready for hands-on ML systems building.
You should see all green checkmarks. This confirms your environment is ready for hands-on ML systems building.
**📖 See [Essential Commands](tito-essentials.html)** for verification commands and troubleshooting.
See [Essential Commands](tito-essentials.md) for verification commands and troubleshooting.
</div>
## 🏗️ 15-Minute First Module Walkthrough
## 15-Minute First Module Walkthrough
Let's build your first neural network component following the **TinyTorch workflow**:
@@ -58,17 +58,17 @@ Let's build your first neural network component following the **TinyTorch workfl
1. Edit modules → 2. Export to package → 3. Validate with milestones
```
**📖 See [Student Workflow](student-workflow.html)** for the complete development cycle.
See [Student Workflow](student-workflow.md) for the complete development cycle.
### Module 01: Tensor Foundations
<div style="background: #fffbeb; padding: 1.5rem; border-radius: 0.5rem; border-left: 4px solid #f59e0b; margin: 1.5rem 0;">
**🎯 Learning Goal:** Build N-dimensional arrays - the foundation of all neural networks
**Learning Goal:** Build N-dimensional arrays - the foundation of all neural networks
**⏱️ Time:** 15 minutes
**Time:** 15 minutes
**💻 Action:** Start with Module 01 to build tensor operations from scratch.
**Action:** Start with Module 01 to build tensor operations from scratch.
```bash
# Step 1: Edit the module source
@@ -91,9 +91,9 @@ tito module complete 01
This makes your implementation importable: `from tinytorch import Tensor`
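Concretely, the Tensor you export in Module 01 behaves something like this minimal stand-in (illustrative only — the method names here are assumptions, not the actual TinyTorch API, and your real implementation will support far more operations):

```python
import numpy as np

class Tensor:
    """Minimal stand-in for the Tensor built in Module 01.

    Illustrative sketch only -- names and behavior are assumptions,
    not the real TinyTorch API.
    """
    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)

    @property
    def shape(self):
        return self.data.shape

    def __add__(self, other):
        # Element-wise addition: typically the first op students implement
        return Tensor(self.data + other.data)

    def __repr__(self):
        return f"Tensor({self.data.tolist()})"

a = Tensor([[1.0, 2.0], [3.0, 4.0]])
b = Tensor([[10.0, 20.0], [30.0, 40.0]])
print((a + b).shape)  # (2, 2)
```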
**📖 See [Student Workflow](student-workflow.html)** for the complete edit → export → validate cycle.
See [Student Workflow](student-workflow.md) for the complete edit → export → validate cycle.
**Achievement Unlocked:** Foundation capability - "Can I create and manipulate the building blocks of ML?"
**Achievement Unlocked:** Foundation capability - "Can I create and manipulate the building blocks of ML?"
</div>
@@ -101,11 +101,11 @@ This makes your implementation importable: `from tinytorch import Tensor`
<div style="background: #fdf2f8; padding: 1.5rem; border-radius: 0.5rem; border-left: 4px solid #ec4899; margin: 1.5rem 0;">
**🎯 Learning Goal:** Add nonlinearity - the key to neural network intelligence
**Learning Goal:** Add nonlinearity - the key to neural network intelligence
**⏱️ Time:** 10 minutes
**Time:** 10 minutes
**💻 Action:** Continue with Module 02 to add activation functions.
**Action:** Continue with Module 02 to add activation functions.
```bash
# Step 1: Edit the module
@@ -126,13 +126,13 @@ You'll implement essential activation functions:
tito module complete 02
```
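The "essential activation functions" you implement in Module 02 typically include ReLU and sigmoid. As a rough sketch of what those functions compute (pure NumPy, not the actual TinyTorch code):

```python
import numpy as np

def relu(x):
    # ReLU: zero out negatives, pass positives through unchanged
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid: squash any real number into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))     # [0. 0. 3.]
print(sigmoid(0))  # 0.5
```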
**📖 See [Student Workflow](student-workflow.html)** for the complete edit → export → validate cycle.
See [Student Workflow](student-workflow.md) for the complete edit → export → validate cycle.
**Achievement Unlocked:** Intelligence capability - "Can I add nonlinearity to enable learning?"
**Achievement Unlocked:** Intelligence capability - "Can I add nonlinearity to enable learning?"
</div>
## 📊 Track Your Progress
## Track Your Progress
After completing your first modules:
@@ -146,73 +146,73 @@ tito checkpoint status # View your completion tracking
This is helpful for self-assessment but not required for the core workflow.
**📖 See [Student Workflow](student-workflow.html)** for the essential edit → export → validate cycle, and [Track Your Progress](learning-progress.html)** for detailed capability tracking.
See [Student Workflow](student-workflow.md) for the essential edit → export → validate cycle, and [Track Your Progress](learning-progress.md) for detailed capability tracking.
</div>
## 🏆 Validate with Historical Milestones
## Validate with Historical Milestones
After exporting your modules, **prove what you've built** by running milestone scripts:
<div style="background: linear-gradient(135deg, #667eea 0%, #764ba2 100%); padding: 2rem; border-radius: 0.5rem; margin: 1.5rem 0; color: white;">
**After Module 07**: Build **Rosenblatt's 1957 Perceptron** - the first trainable neural network
**After Module 07**: Solve the **1969 XOR Crisis** with multi-layer networks
**After Module 08**: Achieve **95%+ accuracy on MNIST** with 1986 backpropagation
**After Module 09**: Hit **75%+ on CIFAR-10** with 1998 CNNs
**After Module 13**: Generate text with **2017 Transformers**
**After Module 07**: Build **Rosenblatt's 1957 Perceptron** - the first trainable neural network
**After Module 07**: Solve the **1969 XOR Crisis** with multi-layer networks
**After Module 08**: Achieve **95%+ accuracy on MNIST** with 1986 backpropagation
**After Module 09**: Hit **75%+ on CIFAR-10** with 1998 CNNs
**After Module 13**: Generate text with **2017 Transformers**
**After Module 18**: Optimize for production with **2018 MLPerf**
**📖 See [Journey Through ML History](chapters/milestones.html)** for complete timeline, requirements, and expected results.
See [Journey Through ML History](chapters/milestones.md) for complete timeline, requirements, and expected results.
</div>
**The Workflow**: Edit modules → Export with `tito module complete N` → Run milestone scripts to validate
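For a taste of what the 1957 perceptron milestone exercises, its core is a forward pass along these lines (a pure-NumPy sketch for intuition — the real milestone script runs against your exported TinyTorch implementations):

```python
import numpy as np

def perceptron_forward(x, w, b):
    # Rosenblatt's 1957 rule: fire (1) if the weighted sum clears the threshold
    return 1 if np.dot(w, x) + b > 0 else 0

# Tiny example: weights chosen by hand so the perceptron computes logical AND
w = np.array([1.0, 1.0])
b = -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", perceptron_forward(np.array(x, dtype=float), w, b))
```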
**📖 See [Student Workflow](student-workflow.html)** for the complete cycle.
See [Student Workflow](student-workflow.md) for the complete cycle.
## 🎯 What You Just Accomplished
## What You Just Accomplished
In 15 minutes, you've:
<div style="display: grid; grid-template-columns: repeat(auto-fit, minmax(250px, 1fr)); gap: 1rem; margin: 2rem 0;">
<div style="background: #e6fffa; padding: 1rem; border-radius: 0.5rem; border-left: 3px solid #26d0ce;">
<h4 style="margin: 0 0 0.5rem 0; color: #0d9488;">🔧 Setup Complete</h4>
<h4 style="margin: 0 0 0.5rem 0; color: #0d9488;">Setup Complete</h4>
<p style="margin: 0; font-size: 0.9rem;">Installed TinyTorch and verified your environment</p>
</div>
<div style="background: #f0f9ff; padding: 1rem; border-radius: 0.5rem; border-left: 3px solid #3b82f6;">
<h4 style="margin: 0 0 0.5rem 0; color: #1d4ed8;">🧱 Created Foundation</h4>
<h4 style="margin: 0 0 0.5rem 0; color: #1d4ed8;">Created Foundation</h4>
<p style="margin: 0; font-size: 0.9rem;">Implemented core tensor operations from scratch</p>
</div>
<div style="background: #fefce8; padding: 1rem; border-radius: 0.5rem; border-left: 3px solid #eab308;">
<h4 style="margin: 0 0 0.5rem 0; color: #a16207;">🏆 First Capability</h4>
<h4 style="margin: 0 0 0.5rem 0; color: #a16207;">First Capability</h4>
<p style="margin: 0; font-size: 0.9rem;">Earned your first ML systems capability checkpoint</p>
</div>
</div>
## 🚀 Your Next Steps
## Your Next Steps
<div style="background: #f8f9fa; padding: 2rem; border-radius: 0.5rem; margin: 2rem 0;">
### Immediate Next Actions (Choose One):
**🔥 Continue Building (Recommended):** Begin Module 03 to add layers to your network.
**Continue Building (Recommended):** Begin Module 03 to add layers to your network.
**📚 Master the Workflow:**
- **📖 See [Student Workflow](student-workflow.html)** for the complete edit → export → validate cycle
- **📖 See [Essential Commands](tito-essentials.html)** for complete TITO command reference
- **📖 See [Track Your Progress](learning-progress.html)** for the full learning path
**Master the Workflow:**
- See [Student Workflow](student-workflow.md) for the complete edit → export → validate cycle
- See [Essential Commands](tito-essentials.md) for complete TITO command reference
- See [Track Your Progress](learning-progress.md) for the full learning path
**🎓 For Instructors:**
- **📖 See [Classroom Setup Guide](usage-paths/classroom-use.html)** for NBGrader integration (coming soon)
**For Instructors:**
- See [Classroom Setup Guide](usage-paths/classroom-use.md) for NBGrader integration (coming soon)
</div>
## 💡 Pro Tips for Continued Success
## Pro Tips for Continued Success
<div style="background: #fff5f5; padding: 1.5rem; border: 1px solid #fed7d7; border-radius: 0.5rem; margin: 1rem 0;">
@@ -221,11 +221,11 @@ In 15 minutes, you've:
2. Export with `tito module complete N`
3. Validate by running milestone scripts
**📖 See [Student Workflow](student-workflow.html)** for detailed workflow guide and best practices.
See [Student Workflow](student-workflow.md) for detailed workflow guide and best practices.
</div>
## 🌟 You're Now a TinyTorch Builder!
## You're Now a TinyTorch Builder
<div style="background: #f8f9fa; padding: 2rem; border-radius: 0.5rem; margin: 2rem 0; text-align: center;">
<h3 style="margin: 0 0 1rem 0; color: #495057;">Ready to Build Production ML Systems</h3>
@@ -238,4 +238,4 @@ In 15 minutes, you've:
**What makes TinyTorch different:** You're not just learning *about* neural networks—you're building them from fundamental mathematical operations. Every line of code you write builds toward complete ML systems mastery.
**Next milestone:** After Module 08, you'll train real neural networks on actual datasets using 100% your own code!
**Next milestone:** After Module 08, you'll train real neural networks on actual datasets using 100% your own code!

site/references.bib Normal file
View File

View File

@@ -1,4 +1,4 @@
# 📚 Additional Learning Resources
# Additional Learning Resources
<div style="background: #f8f9fa; border: 1px solid #dee2e6; padding: 2rem; border-radius: 0.5rem; text-align: center; margin: 2rem 0;">
<h2 style="margin: 0 0 1rem 0; color: #495057;">Complement Your TinyTorch Journey</h2>
@@ -8,14 +8,13 @@
While TinyTorch teaches you to build complete ML systems from scratch, these resources provide broader context, alternative perspectives, and production tools.
**TinyTorch Learning Resources:**
- **📖 See [Track Your Progress](learning-progress.html)** for monitoring capability development and learning progression
- **📖 See [Progress Tracking](checkpoint-system.html)** for technical details on capability testing
- **📖 See [Testing Guide](testing-framework.html)** for comprehensive testing methodology
- **📖 See [Achievement Showcase](leaderboard.html)** for portfolio development and career readiness
- See [Track Your Progress](learning-progress.md) for monitoring capability development and learning progression
- See [Progress Tracking](checkpoint-system.md) for technical details on capability testing
- See [Testing Guide](testing-framework.md) for comprehensive testing methodology
---
## 🎓 Academic Courses
## Academic Courses
### Machine Learning Systems
- **[CS 329S: Machine Learning Systems Design](https://stanford-cs329s.github.io/)** (Stanford)
@@ -36,7 +35,7 @@ While TinyTorch teaches you to build complete ML systems from scratch, these res
---
## 📖 Recommended Books
## Recommended Books
### Systems & Engineering
- **[Machine Learning Systems](https://mlsysbook.ai)** by Prof. Vijay Janapa Reddi (Harvard)
@@ -57,7 +56,7 @@ While TinyTorch teaches you to build complete ML systems from scratch, these res
---
## 🛠️ Alternative Implementations
## Alternative Implementations
**Different approaches to building ML systems from scratch - see how others tackle the same challenge:**
@@ -76,31 +75,31 @@ While TinyTorch teaches you to build complete ML systems from scratch, these res
---
## 🏭 Production Internals
## Production Internals
### Framework Deep Dives
- **[PyTorch Internals](http://blog.ezyang.com/2019/05/pytorch-internals/)** by Edward Yang
*How PyTorch actually works under the hood - a great read to see how what you built in TinyTorch corresponds to the real PyTorch*
- **[PyTorch Documentation: Extending PyTorch](https://pytorch.org/docs/stable/notes/extending.html)**
- **[PyTorch Documentation: Extending PyTorch](https://pytorch.org/docs/stable/notes/extending.html)**
*Custom operators and autograd functions - apply your TinyTorch knowledge*
---
*Building ML systems from scratch gives you the implementation foundation most ML engineers lack. These resources help you apply that knowledge to broader systems and production environments.*
## 🚀 Ready to Begin Your Journey?
## Ready to Begin Your Journey?
**Start with the fundamentals and build your way up.**
**📖 See [Essential Commands](tito-essentials.html)** for complete TITO command reference.
See [Essential Commands](tito-essentials.md) for complete TITO command reference.
**Your Next Steps:**
1. **📖 See [Quick Start Guide](quickstart-guide.html)** for 15-minute hands-on experience
2. **📖 See [Track Your Progress](learning-progress.html)** for understanding capability development
3. **📖 See [Course Introduction](chapters/00-introduction.html)** for deep dive into course philosophy
1. See [Quick Start Guide](quickstart-guide.md) for 15-minute hands-on experience
2. See [Track Your Progress](learning-progress.md) for understanding capability development
3. See [Course Introduction](chapters/00-introduction.md) for deep dive into course philosophy
<div style="background: #f8f9fa; border: 1px solid #dee2e6; padding: 1.5rem; border-radius: 0.5rem; margin: 2rem 0; text-align: center;">
<h4 style="margin: 0 0 1rem 0; color: #495057;">🎯 Transform from Framework User to Systems Engineer</h4>
<h4 style="margin: 0 0 1rem 0; color: #495057;">Transform from Framework User to Systems Engineer</h4>
<p style="margin: 0; color: #6c757d;">These external resources complement the hands-on systems building you'll do in TinyTorch</p>
</div>
</div>

View File

@@ -70,14 +70,14 @@ See [Milestones Guide](chapters/milestones.md) for the full progression.
TinyTorch has 20 modules organized in three tiers:
### 🏗️ Foundation (Modules 01-07)
### Foundation (Modules 01-07)
Core ML infrastructure - tensors, autograd, training loops
**Milestones unlocked:**
- M01: Perceptron (after Module 07)
- M02: XOR Crisis (after Module 07)
### 🏛️ Architecture (Modules 08-13)
### Architecture (Modules 08-13)
Neural network architectures - data loading, CNNs, transformers
**Milestones unlocked:**
@@ -85,12 +85,15 @@ Neural network architectures - data loading, CNNs, transformers
- M04: CNNs (after Module 09)
- M05: Transformers (after Module 13)
### Optimization (Modules 14-20)
Production optimization - profiling, quantization, benchmarking, capstone
### Optimization (Modules 14-19)
Production optimization - profiling, quantization, benchmarking
**Milestones unlocked:**
- M06: MLPerf (after Module 18)
### Capstone Competition (Module 20)
Apply all optimizations in the MLPerf® Edu Competition
## Typical Development Session
Here's what a typical session looks like:

View File

@@ -1,17 +1,17 @@
# 🧪 Testing Framework
# Testing Framework
```{admonition} Test-Driven ML Engineering
:class: tip
TinyTorch's testing framework ensures your implementations are not just educational, but production-ready and reliable.
```
## 🎯 Testing Philosophy: Verify Understanding Through Implementation
## Testing Philosophy: Verify Understanding Through Implementation
TinyTorch testing goes beyond checking syntax - it validates that you understand ML systems engineering through working implementations.
## Quick Start: Validate Your Implementation
## Quick Start: Validate Your Implementation
### 🚀 Run Everything (Recommended)
### Run Everything (Recommended)
```bash
# Complete validation suite
tito test --comprehensive
@@ -23,7 +23,7 @@ tito test --comprehensive
# ✅ Overall TinyTorch Health: 100.0%
```
### 🎯 Target-Specific Testing
### Target-Specific Testing
```bash
# Test what you just built
tito module complete 02_tensor && tito checkpoint test 01
@@ -35,9 +35,9 @@ tito test --module attention --verbose
tito test --performance --module training
```
## 🔬 Testing Levels: From Components to Systems
## Testing Levels: From Components to Systems
### 1. 🧩 Module-Level Testing
### 1. Module-Level Testing
**Goal**: Verify individual components work correctly in isolation
```bash
@@ -91,7 +91,7 @@ tito checkpoint test 13 # "Can I build attention mechanisms?"
tito checkpoint validate --from 00 --to 15
```
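Under the hood, module-level checks are ordinary assertions against your implementations. A sketch of the style, using a stand-in function rather than the real TinyTorch imports (names here are illustrative assumptions):

```python
import numpy as np

def linear_forward(x, w, b):
    # Stand-in for a Module 03 dense-layer forward pass: y = xW + b
    return x @ w + b

def test_linear_output_shape():
    x = np.zeros((8, 4))   # batch of 8 samples, 4 features each
    w = np.zeros((4, 3))
    b = np.zeros(3)
    assert linear_forward(x, w, b).shape == (8, 3)

def test_linear_known_values():
    x = np.array([[1.0, 2.0]])
    w = np.array([[1.0], [1.0]])
    b = np.array([0.5])
    assert linear_forward(x, w, b).tolist() == [[3.5]]

test_linear_output_shape()
test_linear_known_values()
print("module-level checks passed")
```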
**📖 See [Complete Checkpoint System Documentation](checkpoint-system.html)** for technical implementation details.
**📖 See [Complete Checkpoint System Documentation](checkpoint-system.md)** for technical implementation details.
**Key Capability Categories:**
- **Foundation (00-03)**: Building blocks of neural networks
@@ -374,9 +374,9 @@ tito checkpoint status
```
**Testing Integration with Your Learning Path:**
- **📖 See [Track Your Progress](learning-progress.html)** for how testing fits into capability development
- **📖 See [Track Capabilities](checkpoint-system.html)** for automated testing and progress validation
- **📖 See [Showcase Achievements](leaderboard.html)** for how testing validates the skills you can claim
- **📖 See [Track Your Progress](learning-progress.md)** for how testing fits into capability development
- **📖 See [Track Capabilities](checkpoint-system.md)** for automated testing and progress validation
- **📖 See [Historical Milestones](chapters/milestones.md)** for how testing validates achievements
<div style="background: #e3f2fd; border: 2px solid #1976d2; padding: 1.5rem; border-radius: 0.5rem; margin: 2rem 0; text-align: center;">
<h4 style="margin: 0 0 1rem 0; color: #1565c0;">🎯 Testing Excellence = ML Systems Mastery</h4>

View File

@@ -13,37 +13,37 @@ TinyTorch follows a simple three-step cycle: **Edit modules → Export to packag
**The essential command**: `tito module complete MODULE_NUMBER` - exports your code to the TinyTorch package.
**📖 See [Student Workflow](student-workflow.html)** for the complete development cycle, best practices, and troubleshooting.
See [Student Workflow](student-workflow.md) for the complete development cycle, best practices, and troubleshooting.
This page documents all available TITO commands. The checkpoint system (`tito checkpoint status`) is optional for progress tracking.
## 🚀 Most Important Commands
## Most Important Commands
The commands you'll use most often:
<div style="display: grid; grid-template-columns: 1fr; gap: 1rem; margin: 2rem 0;">
<div style="background: #e3f2fd; padding: 1.5rem; border-radius: 0.5rem; border-left: 4px solid #2196f3;">
<h4 style="margin: 0 0 0.5rem 0; color: #1976d2;">📋 Check Your Environment</h4>
<h4 style="margin: 0 0 0.5rem 0; color: #1976d2;">Check Your Environment</h4>
<code style="background: #263238; color: #ffffff; padding: 0.5rem; border-radius: 0.25rem; display: block; margin: 0.5rem 0;">tito system doctor</code>
<p style="margin: 0.5rem 0 0 0; font-size: 0.9rem; color: #64748b;">Verify your setup is ready for development</p>
</div>
<div style="background: #fffbeb; padding: 1.5rem; border-radius: 0.5rem; border-left: 4px solid #f59e0b;">
<h4 style="margin: 0 0 0.5rem 0; color: #d97706;">🔨 Export Module to Package (Essential)</h4>
<h4 style="margin: 0 0 0.5rem 0; color: #d97706;">Export Module to Package (Essential)</h4>
<code style="background: #263238; color: #ffffff; padding: 0.5rem; border-radius: 0.25rem; display: block; margin: 0.5rem 0;">tito module complete 01</code>
<p style="margin: 0.5rem 0 0 0; font-size: 0.9rem; color: #64748b;">Export your module to the TinyTorch package - use this after editing modules</p>
</div>
<div style="background: #f0fdf4; padding: 1.5rem; border-radius: 0.5rem; border-left: 4px solid #22c55e;">
<h4 style="margin: 0 0 0.5rem 0; color: #15803d;">🎯 Track Your Progress (Optional)</h4>
<h4 style="margin: 0 0 0.5rem 0; color: #15803d;">Track Your Progress (Optional)</h4>
<code style="background: #263238; color: #ffffff; padding: 0.5rem; border-radius: 0.25rem; display: block; margin: 0.5rem 0;">tito checkpoint status</code>
<p style="margin: 0.5rem 0 0 0; font-size: 0.9rem; color: #64748b;">See which capabilities you've mastered (optional capability tracking)</p>
</div>
</div>
## 🔄 Typical Development Session
## Typical Development Session
Here's what a typical session looks like:
@@ -73,13 +73,13 @@ python 01_rosenblatt_forward.py # Uses YOUR implementation!
tito checkpoint status # See what you've completed
```
**📖 See [Student Workflow](student-workflow.html)** for complete development cycle details.
See [Student Workflow](student-workflow.md) for complete development cycle details.
</div>
## 📖 Complete Command Reference
## Complete Command Reference
### 🏥 System & Health
### System & Health
<div style="background: #f8f9fa; padding: 1rem; border-radius: 0.25rem; margin: 1rem 0;">
**System Check**
@@ -96,7 +96,7 @@ tito system info
</div>
### 🔨 Module Management
### Module Management
<div style="background: #f8f9fa; padding: 1rem; border-radius: 0.25rem; margin: 1rem 0;">
**Export Module to Package (Essential)**
@@ -117,7 +117,7 @@ from tinytorch.autograd import backward # YOUR implementation!
</div>
### 📊 Progress Tracking (Optional)
### Progress Tracking (Optional)
<div style="background: #f8f9fa; padding: 1rem; border-radius: 0.25rem; margin: 1rem 0;">
**Capability Overview**
@@ -146,7 +146,7 @@ tito checkpoint test CHECKPOINT_NUMBER
</div>
## 👩‍🏫 Instructor Commands (Coming Soon)
## Instructor Commands (Coming Soon)
<div style="background: #f3e5f5; padding: 1rem; border-radius: 0.25rem; margin: 1rem 0;">
@@ -154,11 +154,11 @@ TinyTorch includes NBGrader integration for classroom use. Full documentation fo
**For now, focus on the student workflow**: edit modules → export → validate with milestones.
*For current instructor capabilities, see [Classroom Use Guide](usage-paths/classroom-use.html)*
*For current instructor capabilities, see [Classroom Use Guide](usage-paths/classroom-use.md)*
</div>
## 🚨 Troubleshooting Commands
## Troubleshooting Commands
When things go wrong, these commands help:
@@ -178,7 +178,7 @@ tito checkpoint timeline # Visualize your progress
</div>
## 🚀 Ready to Build?
## Ready to Build?
<div style="background: #f8f9fa; padding: 2rem; border-radius: 0.5rem; margin: 2rem 0; text-align: center;">
<h3 style="margin: 0 0 1rem 0; color: #495057;">Start Your TinyTorch Journey</h3>
@@ -189,4 +189,4 @@ tito checkpoint timeline # Visualize your progress
---
*Master these commands and you'll build ML systems with confidence. Every command is designed to accelerate your learning and keep you focused on what matters: building production-quality ML frameworks from scratch.*
*Master these commands and you'll build ML systems with confidence. Every command is designed to accelerate your learning and keep you focused on what matters: building production-quality ML frameworks from scratch.*

View File

@@ -108,7 +108,7 @@
The TinyTorch course consists of 20 progressive modules organized into learning stages.
**📖 See [Complete Course Structure](../chapters/00-introduction.html#course-structure)** for detailed module descriptions, learning objectives, and prerequisites for each module.
**📖 See [Complete Course Structure](../chapters/00-introduction.md)** for detailed module descriptions, learning objectives, and prerequisites for each module.
---
@@ -198,9 +198,9 @@ tito nbgrader release 01_setup
## Instructor Resources
### Documentation
- [Complete Instructor Guide](../instructor-guide.md) - Detailed setup and workflow
- [Quick Reference Card](../../NBGrader_Quick_Reference.md) - Essential commands
- Module-specific teaching notes in each chapter
- Module-specific teaching notes in each ABOUT.md file
- [Course Structure](../chapters/00-introduction.md) - Full curriculum overview
- [Student Workflow](../student-workflow.md) - Essential development cycle
### Support Tools
- `tito module status --comprehensive` - System health dashboard
@@ -216,10 +216,10 @@ tito nbgrader release 01_setup
## 📞 Next Steps
1. **📖 Read the [Instructor Guide](../instructor-guide.md)** for complete details
2. **🚀 Start with Module 0: [Introduction](../chapters/00-introduction.md)** to see the system overview
3. **💻 Set up your environment** following the guide
4. **📧 Contact us** for instructor support
1. **📖 Review [Course Structure](../chapters/00-introduction.md)** for complete curriculum overview
2. **🚀 Explore [Student Workflow](../student-workflow.md)** to understand the development cycle
3. **💻 Set up your environment** using the [Quick Start Guide](../quickstart-guide.md)
4. **📧 Contact us** via GitHub Issues for instructor support
---