mirror of
https://github.com/MLSysBook/TinyTorch.git
synced 2026-05-08 07:32:31 -05:00
✨ Refined landing page with better balance and substance
Key improvements:
- Added 'Learning Opportunity' section with positive framing
- Expanded 'What Makes TinyTorch Different' with concrete examples
- Enhanced learning philosophy with complete example cycle
- Moved CTA section lower after building value and understanding
- Added more substance to each section while maintaining scannability
- Improved course journey descriptions with more detail
- Better flow: Hook → Opportunity → Difference → Philosophy → Journey → CTA
- Maintained positive tone without putting other approaches down

Result: More substantial content that builds desire before asking for action.
This commit is contained in:
134
book/intro.md
@@ -13,20 +13,13 @@
**Result:** You become the expert others ask about "how PyTorch actually works."
```

## 🚀 **Start Building Now**

---

```{admonition} Choose Your Adventure
:class: important

## 🎯 **The Learning Opportunity**

### **🔬 [Quick Try (5 min)](usage-paths/quick-exploration.md)**
*Just want to see what this is?* → Click and code in your browser. No setup needed.

Most ML education focuses on using existing frameworks — import PyTorch, call `nn.Linear()`, run `optimizer.step()`. This approach gets you building models quickly, but leaves a gap: **how do these frameworks actually work under the hood?**

### **🏗️ [Build for Real (8+ weeks)](usage-paths/serious-development.md)**
*Ready to build your own ML framework?* → Fork the repo and start your journey.

### **👨🏫 [Teach This Course](usage-paths/classroom-use.md)**
*Want complete course infrastructure?* → Turn-key solution for educators.
```

TinyTorch fills that gap. Instead of treating frameworks as black boxes, you'll build every component yourself. The result? Deep understanding that makes you the person others come to with questions about "how PyTorch really works."

---

@@ -49,66 +42,37 @@ Traditional ML Course: TinyTorch Approach:
Go from "How does this work?" 🤷 to "I implemented every line!" 💪
```

This isn't just about learning — it's about developing the deep systems thinking that distinguishes ML engineers from ML users.

---

## 🌟 **Why TinyTorch Works**
## 🌟 **What Makes TinyTorch Different**

### **🔬 Build-First Learning**
No black boxes. Implement every component from scratch. Use YOUR code in real neural networks.

### **🔬 Build-First Philosophy**
You don't just learn about tensors — you implement the `Tensor` class from scratch. You don't just use ReLU — you write the activation function yourself. Every component you build becomes part of your personal ML framework that actually works on real data.

### **🚀 Production-Ready Skills**
Professional workflow with `tito` CLI. Real datasets like CIFAR-10. MLOps patterns from day one.

From day one, you'll use professional development practices: the `tito` CLI for project management, automated testing for quality assurance, real datasets like CIFAR-10 for training, and MLOps patterns for deployment. This isn't toy code — it's the foundation for production ML systems.

### **⚡ Instant Results**
Code works immediately. Visual progress indicators. Watch your `ReLU` power real networks.

Your code works immediately. Implement a `ReLU` function in Module 3, and by Module 5 you're watching it power real neural networks. Visual progress indicators and comprehensive testing ensure you always know your implementations are correct.

### **🎯 Progressive Mastery**
Start simple (`hello_world()`). Build systematically. End powerful (production MLOps).

---

## 📚 **Course Journey: 14 Modules**
```{admonition} 🏗️ Foundation (Modules 1-5)
:class: note
**Weeks 1-6: Core Infrastructure**
- **Setup**: Professional development workflow
- **Tensors**: Multi-dimensional arrays (like NumPy, but yours!)
- **Activations**: ReLU, Sigmoid, Tanh - the math that enables learning
- **Layers**: Dense layers with matrix multiplication
- **Networks**: Sequential architecture - chain layers into complete models
```

```{admonition} 🧠 Deep Learning (Modules 6-10)
:class: note
**Weeks 7-12: Complete Training Systems**
- **CNNs**: Convolutional operations for computer vision
- **DataLoader**: CIFAR-10 loading, batching, preprocessing
- **Autograd**: Automatic differentiation engine (PyTorch's "magic")
- **Optimizers**: SGD, Adam, learning rate scheduling
- **Training**: Loss functions, metrics, complete orchestration
```

```{admonition} ⚡ Production (Modules 11-14)
:class: note
**Weeks 13-16: Real-World Deployment**
- **Compression**: Model pruning and quantization (75% size reduction)
- **Kernels**: High-performance custom operations
- **Benchmarking**: Systematic evaluation and performance measurement
- **MLOps**: Production monitoring, continuous learning, complete pipeline
```

Start simple with a `hello_world()` function, build systematically through tensors and layers, and end with production MLOps systems. Each module builds on previous work, creating a complete learning journey from foundations to advanced systems.
---
---

## 🎓 **Learning Philosophy: Build → Use → Master**

Every component follows the same powerful learning cycle:

### **Example: Activation Functions**

**🔧 Build:** Implement ReLU from scratch
```python
import numpy as np

def relu(x):
    # YOU implement this function
    # return ???  # What should this be?
    return np.maximum(0, x)  # Your solution
```

**🚀 Use:** Immediately use your own code
@@ -128,13 +92,69 @@ model = Sequential([
])
```
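The diff truncates the middle of that snippet. As a rough self-contained sketch of the same Build → Use step, here is YOUR `relu` powering a tiny forward pass in plain NumPy (the layer shapes are made up for illustration; this is not TinyTorch's actual `Sequential` API):

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)  # the function you built above

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)  # hypothetical dense layer 1
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)  # hypothetical dense layer 2

x = rng.normal(size=(2, 4))        # a batch of 2 inputs
h = relu(x @ W1 + b1)              # your activation in the middle of a real network
out = h @ W2 + b2
print(out.shape)                   # (2, 3): one 3-dim output per input
```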
This pattern repeats for every component - you build it, use it immediately, then see how it fits into larger systems.
This pattern repeats for every component — tensors, layers, optimizers, even MLOps systems. You build it, use it immediately, then see how it fits into larger systems.

---

## 📚 **Academic Foundation**
## 📚 **Course Journey: 14 Modules**

TinyTorch grew out of CS249r: Tiny Machine Learning Systems at Harvard University. While the [Machine Learning Systems book](https://mlsysbook.ai) covers broad principles, TinyTorch gives you hands-on implementation experience.

```{admonition} 🏗️ Foundation (Modules 1-5)
:class: note
**Weeks 1-6: Core Infrastructure**
- **Setup**: Professional development workflow with `tito` CLI and testing
- **Tensors**: Multi-dimensional arrays with operations (like NumPy, but yours!)
- **Activations**: ReLU, Sigmoid, Tanh — the mathematical functions that enable learning
- **Layers**: Dense layers with matrix multiplication and weight management
- **Networks**: Sequential architecture — chain layers into complete models
```
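To make the Tensors bullet concrete, here is a minimal sketch of the kind of class that "like NumPy, but yours" implies (the names and scope are illustrative, not TinyTorch's actual API):

```python
import numpy as np

class Tensor:
    """A thin wrapper over a NumPy array: the seed of a DIY framework."""

    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)

    @property
    def shape(self):
        return self.data.shape

    def __add__(self, other):
        return Tensor(self.data + other.data)

    def __matmul__(self, other):
        return Tensor(self.data @ other.data)

a = Tensor([[1.0, 2.0], [3.0, 4.0]])
b = Tensor([[5.0], [6.0]])
print((a @ b).shape)  # (2, 1)
```

The real module layers on more operations, broadcasting, and dtype handling; the point is that every behavior you normally take for granted is code you wrote.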

```{admonition} 🧠 Deep Learning (Modules 6-10)
:class: note
**Weeks 7-12: Complete Training Systems**
- **CNNs**: Convolutional operations for computer vision applications
- **DataLoader**: CIFAR-10 loading, batching, and preprocessing pipelines
- **Autograd**: Automatic differentiation engine (the "magic" behind PyTorch)
- **Optimizers**: SGD with momentum, Adam with adaptive learning rates
- **Training**: Loss functions, metrics, and complete training orchestration
```
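The Autograd bullet is the conceptual heart of the block. A toy scalar version of reverse-mode differentiation (purely illustrative, far simpler than the module's real engine, which would also need a topological ordering for general graphs) looks like:

```python
class Value:
    """Toy reverse-mode autodiff on scalars."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents  # (parent, local_gradient) pairs

    def __add__(self, other):
        return Value(self.data + other.data, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Value(self.data * other.data, [(self, other.data), (other, self.data)])

    def backward(self, grad=1.0):
        # Accumulate, then push the gradient to parents via the chain rule
        self.grad += grad
        for parent, local in self._parents:
            parent.backward(grad * local)

x = Value(2.0)
y = Value(3.0)
z = x * y + x          # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```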

```{admonition} ⚡ Production (Modules 11-14)
:class: note
**Weeks 13-16: Real-World Deployment**
- **Compression**: Model pruning and quantization for 75% size reduction
- **Kernels**: High-performance custom operations and optimization
- **Benchmarking**: Systematic evaluation and performance measurement
- **MLOps**: Production monitoring, continuous learning, complete pipeline
```
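The 75% figure in the Compression bullet follows directly from storing float32 weights as int8 (4 bytes per weight becomes 1). A bare-bones sketch of uniform quantization (illustrative only, not the module's actual code):

```python
import numpy as np

def quantize(w):
    """Map nonzero float32 weights onto int8 with a single scale factor."""
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.default_rng(0).normal(size=1000).astype(np.float32)
q, scale = quantize(w)
print(q.nbytes / w.nbytes)  # 0.25, i.e. 75% smaller
```

Rounding introduces at most half a quantization step of error per weight, which is the accuracy/size trade-off the module measures.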

---

## 🚀 **Choose Your Learning Path**

```{admonition} Three Ways to Engage with TinyTorch
:class: important

### **🔬 [Quick Exploration](usage-paths/quick-exploration.md)** *(5 minutes)*
*"I want to see what this is about"*
- Click and run code immediately in your browser (Binder)
- No installation or setup required
- Implement ReLU, tensors, neural networks interactively
- Perfect for getting a feel for the course

### **🏗️ [Serious Development](usage-paths/serious-development.md)** *(8+ weeks)*
*"I want to build this myself"*
- Fork the repo and work locally with full development environment
- Build complete ML framework from scratch with `tito` CLI
- 14 progressive assignments from setup to production MLOps
- Professional development workflow with automated testing

### **👨🏫 [Classroom Use](usage-paths/classroom-use.md)** *(Instructors)*
*"I want to teach this course"*
- Complete course infrastructure with NBGrader integration
- Automated grading for comprehensive testing
- Flexible pacing (8-16 weeks) with proven pedagogical structure
- Turn-key solution for ML systems education
```

---

@@ -142,3 +162,9 @@ TinyTorch grew out of CS249r: Tiny Machine Learning Systems at Harvard Universit

### **Quick Taste: Try Module 1 Right Now**
Want to see what TinyTorch feels like? **[Launch the Setup chapter](chapters/01-setup.md)** in Binder and implement your first TinyTorch function in 2 minutes!

---

## 📚 **Academic Foundation**

TinyTorch grew out of CS249r: Tiny Machine Learning Systems at Harvard University. While the [Machine Learning Systems book](https://mlsysbook.ai) covers broad principles, TinyTorch gives you hands-on implementation experience.